HR Analytics

About: HR analytics is revolutionising the way human resources departments operate, leading to higher efficiency and better results overall. Human resources has been using analytics for years. However, the collection, processing and analysis of data has been largely manual, and given the nature of human resources dynamics and HR KPIs, the approach has been constraining HR. Therefore, it is surprising that HR departments woke up to the utility of machine learning so late in the game. Here is an opportunity to try predictive analytics in identifying the employees most likely to get promoted.

Problem Statement: Your client is a large MNC and they have 9 broad verticals across the organisation. One of the problem your client is facing is around identifying the right people for promotion (only for manager position and below) and prepare them in time. Currently the process, they are following is:

  • They first identify a set of employees based on recommendations/ past performance
  • Selected employees go through the separate training and evaluation program for each vertical. These programs are based on the required skill of each vertical
  • At the end of the program, based on various factors such as training performance, KPI completion (only employees with KPIs completed greater than 60% are considered) etc., employee gets promotion
  • For above mentioned process, the final promotions are only announced after the evaluation and this leads to delay in transition to their new roles. Hence, company needs your help in identifying the eligible candidates at a particular checkpoint so that they can expedite the entire promotion cycle.

For above mentioned process, the final promotions are only announced after the evaluation and this leads to delay in transition to their new roles. Hence, company needs your help in identifying the eligible candidates at a particular checkpoint so that they can expedite the entire promotion cycle.

They have provided multiple attributes around Employee's past and current performance along with demographics. Now, The task is to predict whether a potential promotee at checkpoint in the test set will be promoted or not after the evaluation process.

Attributes Information:

  1. employee_id:Unique ID for employee
  2. department:Department of employee
  3. region:Region of employment (unordered)
  4. education:Education Level
  5. gender:Gender of Employee
  6. recruitment_channel:Channel of recruitment for employee
  7. no_of_trainings:no of other trainings completed in previous year on soft skills, technical skills etc.
  8. age:Age of Employee
  9. previous_year_rating:Employee Rating for the previous year
  10. length_of_service:Length of service in years
  11. KPIs_met >80%:if Percent of KPIs(Key performance Indicators) >80% then 1 else 0
  12. awards_won?:if awards won during previous year then 1 else 0
  13. avg_training_score:Average score in current training evaluations
  14. is_promoted (Target):Recommended for promotion

Evaluation Metric: F1-Score.

Importing libraries

In [18]:
import numpy as np
import pandas as pd

import seaborn as sns
sns.set_style('whitegrid')
import matplotlib.pyplot as plt
import sweetviz as sv
from pandas_profiling import ProfileReport

import statistics

from scipy.stats import skew, norm

import warnings
warnings.filterwarnings('ignore')

pd.set_option('display.max_columns',None)  
pd.set_option('display.expand_frame_repr',False)
pd.set_option('max_colwidth',-1)

from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report, confusion_matrix, accuracy_score, f1_score
from sklearn.model_selection import GridSearchCV
from sklearn.preprocessing import OneHotEncoder

from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from xgboost import XGBClassifier
from sklearn.ensemble import AdaBoostClassifier
from catboost import CatBoostClassifier
from lightgbm import LGBMClassifier
from mlxtend.classifier import EnsembleVoteClassifier
from mlxtend.classifier import StackingClassifier

Reading datasets

In [2]:
train=pd.read_csv('train_LZdllcl.csv')
test=pd.read_csv('test_2umaH9m.csv')
submission=pd.read_csv('sample_submission_M0L0uXE.csv')

Taking a look through datasets

In [3]:
train
Out[3]:
employee_id department region education gender recruitment_channel no_of_trainings age previous_year_rating length_of_service KPIs_met >80% awards_won? avg_training_score is_promoted
0 65438 Sales & Marketing region_7 Master's & above f sourcing 1 35 5.0 8 1 0 49 0
1 65141 Operations region_22 Bachelor's m other 1 30 5.0 4 0 0 60 0
2 7513 Sales & Marketing region_19 Bachelor's m sourcing 1 34 3.0 7 0 0 50 0
3 2542 Sales & Marketing region_23 Bachelor's m other 2 39 1.0 10 0 0 50 0
4 48945 Technology region_26 Bachelor's m other 1 45 3.0 2 0 0 73 0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
54803 3030 Technology region_14 Bachelor's m sourcing 1 48 3.0 17 0 0 78 0
54804 74592 Operations region_27 Master's & above f other 1 37 2.0 6 0 0 56 0
54805 13918 Analytics region_1 Bachelor's m other 1 27 5.0 3 1 0 79 0
54806 13614 Sales & Marketing region_9 NaN m sourcing 1 29 1.0 2 0 0 45 0
54807 51526 HR region_22 Bachelor's m other 1 27 1.0 5 0 0 49 0

54808 rows × 14 columns

In [4]:
test
Out[4]:
employee_id department region education gender recruitment_channel no_of_trainings age previous_year_rating length_of_service KPIs_met >80% awards_won? avg_training_score
0 8724 Technology region_26 Bachelor's m sourcing 1 24 NaN 1 1 0 77
1 74430 HR region_4 Bachelor's f other 1 31 3.0 5 0 0 51
2 72255 Sales & Marketing region_13 Bachelor's m other 1 31 1.0 4 0 0 47
3 38562 Procurement region_2 Bachelor's f other 3 31 2.0 9 0 0 65
4 64486 Finance region_29 Bachelor's m sourcing 1 30 4.0 7 0 0 61
... ... ... ... ... ... ... ... ... ... ... ... ... ...
23485 53478 Legal region_2 Below Secondary m sourcing 1 24 3.0 1 0 0 61
23486 25600 Technology region_25 Bachelor's m sourcing 1 31 3.0 7 0 0 74
23487 45409 HR region_16 Bachelor's f sourcing 1 26 4.0 4 0 0 50
23488 1186 Procurement region_31 Bachelor's m sourcing 3 27 NaN 1 0 0 70
23489 5973 Technology region_17 Master's & above m other 3 40 5.0 5 1 0 89

23490 rows × 13 columns

In [5]:
train.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 54808 entries, 0 to 54807
Data columns (total 14 columns):
 #   Column                Non-Null Count  Dtype  
---  ------                --------------  -----  
 0   employee_id           54808 non-null  int64  
 1   department            54808 non-null  object 
 2   region                54808 non-null  object 
 3   education             52399 non-null  object 
 4   gender                54808 non-null  object 
 5   recruitment_channel   54808 non-null  object 
 6   no_of_trainings       54808 non-null  int64  
 7   age                   54808 non-null  int64  
 8   previous_year_rating  50684 non-null  float64
 9   length_of_service     54808 non-null  int64  
 10  KPIs_met >80%         54808 non-null  int64  
 11  awards_won?           54808 non-null  int64  
 12  avg_training_score    54808 non-null  int64  
 13  is_promoted           54808 non-null  int64  
dtypes: float64(1), int64(8), object(5)
memory usage: 5.9+ MB
In [6]:
test.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 23490 entries, 0 to 23489
Data columns (total 13 columns):
 #   Column                Non-Null Count  Dtype  
---  ------                --------------  -----  
 0   employee_id           23490 non-null  int64  
 1   department            23490 non-null  object 
 2   region                23490 non-null  object 
 3   education             22456 non-null  object 
 4   gender                23490 non-null  object 
 5   recruitment_channel   23490 non-null  object 
 6   no_of_trainings       23490 non-null  int64  
 7   age                   23490 non-null  int64  
 8   previous_year_rating  21678 non-null  float64
 9   length_of_service     23490 non-null  int64  
 10  KPIs_met >80%         23490 non-null  int64  
 11  awards_won?           23490 non-null  int64  
 12  avg_training_score    23490 non-null  int64  
dtypes: float64(1), int64(7), object(5)
memory usage: 2.3+ MB

We can see there are some NaN values in variables such as education and previous_year_rating in both train and test sets.

In [7]:
print(train.shape)
for i in train.columns.values:
    print (i)
    print (len(train[i].unique()))
    print("----------")
(54808, 14)
employee_id
54808
----------
department
9
----------
region
34
----------
education
4
----------
gender
2
----------
recruitment_channel
3
----------
no_of_trainings
10
----------
age
41
----------
previous_year_rating
6
----------
length_of_service
35
----------
KPIs_met >80%
2
----------
awards_won?
2
----------
avg_training_score
61
----------
is_promoted
2
----------
In [8]:
print(test.shape)
for i in test.columns.values:
    print (i)
    print (len(test[i].unique()))
    print("----------")
(23490, 13)
employee_id
23490
----------
department
9
----------
region
34
----------
education
4
----------
gender
2
----------
recruitment_channel
3
----------
no_of_trainings
9
----------
age
41
----------
previous_year_rating
6
----------
length_of_service
34
----------
KPIs_met >80%
2
----------
awards_won?
2
----------
avg_training_score
61
----------

The levels of categorical variables in both train and test sets are equal.

In [9]:
train[train.isnull().any(axis=1)]
Out[9]:
employee_id department region education gender recruitment_channel no_of_trainings age previous_year_rating length_of_service KPIs_met >80% awards_won? avg_training_score is_promoted
10 29934 Technology region_23 NaN m sourcing 1 30 NaN 1 0 0 77 0
21 33332 Operations region_15 NaN m sourcing 1 41 4.0 11 0 0 57 0
23 71177 Procurement region_5 Bachelor's m other 1 27 NaN 1 0 0 70 0
29 74759 Sales & Marketing region_4 Bachelor's m sourcing 1 26 NaN 1 0 0 44 0
32 35465 Sales & Marketing region_7 NaN f sourcing 1 24 1.0 2 0 0 48 0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
54742 38935 Sales & Marketing region_31 NaN m other 1 28 4.0 3 0 0 47 0
54746 10546 Finance region_6 Bachelor's m other 1 28 NaN 1 1 0 61 0
54773 37919 Finance region_2 Bachelor's m other 1 23 NaN 1 1 0 61 0
54801 12431 Technology region_26 Bachelor's f sourcing 1 31 NaN 1 0 0 78 0
54806 13614 Sales & Marketing region_9 NaN m sourcing 1 29 1.0 2 0 0 45 0

6148 rows × 14 columns

In [10]:
test[test.isnull().any(axis=1)]
Out[10]:
employee_id department region education gender recruitment_channel no_of_trainings age previous_year_rating length_of_service KPIs_met >80% awards_won? avg_training_score
0 8724 Technology region_26 Bachelor's m sourcing 1 24 NaN 1 1 0 77
21 5677 Technology region_17 Bachelor's m sourcing 1 25 NaN 1 0 0 80
32 67672 Technology region_17 Bachelor's m other 1 29 NaN 1 1 0 85
39 55325 Analytics region_22 Bachelor's m other 1 25 NaN 1 0 0 88
47 44159 Analytics region_22 Master's & above m other 1 31 NaN 1 1 0 84
... ... ... ... ... ... ... ... ... ... ... ... ... ...
23452 65429 Analytics region_15 NaN m sourcing 1 30 2.0 7 0 0 83
23459 30477 Sales & Marketing region_22 NaN m other 1 31 2.0 7 0 0 52
23479 39410 Sales & Marketing region_2 Bachelor's m other 3 20 NaN 1 0 0 49
23482 27284 Sales & Marketing region_2 NaN m sourcing 2 44 4.0 4 0 0 49
23488 1186 Procurement region_31 Bachelor's m sourcing 3 27 NaN 1 0 0 70

2671 rows × 13 columns

In [11]:
sns.countplot(x='is_promoted',data=train,palette='bright');

Pandas Profiling

The target variable's class is highly imbalance.

Generates profile reports from a pandas DataFrame. The pandas df.describe() function is great but a little basic for serious exploratory data analysis. pandas_profiling extends the pandas DataFrame with df.profile_report() for quick data analysis.

For each column the following statistics - if relevant for the column type - are presented in an interactive HTML report:

  • Type inference: detect the types of columns in a dataframe.
  • Essentials: type, unique values, missing values
  • Quantile statistics like minimum value, Q1, median, Q3, maximum, range, interquartile range
  • Descriptive statistics like mean, mode, standard deviation, sum, median absolute deviation, coefficient of variation, kurtosis, skewness
  • Most frequent values
  • Histogram
  • Correlations highlighting of highly correlated variables, Spearman, Pearson and Kendall matrices
  • Missing values matrix, count, heatmap and dendrogram of missing values
  • Text analysis learn about categories (Uppercase, Space), scripts (Latin, Cyrillic) and blocks (ASCII) of text data.
  • File and Image analysis extract file sizes, creation dates and dimensions and scan for truncated images or those containing EXIF information.

On train set

In [22]:
profile_train=ProfileReport(train,title='Train',explorative=True)
profile_train



Out[22]:

On test set

In [23]:
profile_test=ProfileReport(test,title='Test',explorative=True)
profile_test



Out[23]:

Imputations

On train set

In [12]:
statistics.mode(train['education'])
Out[12]:
"Bachelor's"
In [13]:
statistics.mode(train['previous_year_rating'])
Out[13]:
3.0
In [14]:
train['education'].fillna("Bachelor's",inplace=True)
In [15]:
train['previous_year_rating'].fillna(3.0,inplace=True)

On test set

In [16]:
statistics.mode(test['education'])
Out[16]:
"Bachelor's"
In [17]:
statistics.mode(test['previous_year_rating'])
Out[17]:
3.0
In [18]:
test['education'].fillna("Bachelor's",inplace=True)
In [19]:
test['previous_year_rating'].fillna(3.0,inplace=True)

EDA

In [20]:
sns.countplot(x='education',data=train,palette='dark');
In [21]:
f,ax=plt.subplots(figsize=(6,4))

sns.countplot(x='department',data=train,palette='dark');
ax.set_xticklabels(ax.get_xticklabels(),rotation=30);
In [22]:
fig,(axis1,axis2)=plt.subplots(1,2,figsize=(15,5));

sns.barplot(x='KPIs_met >80%',y='age',hue='education',palette='rainbow',data=train,ax=axis1);
sns.barplot(x='awards_won?',y='age',hue='education',palette='rainbow',data=train,ax=axis2);

We see similar patterns in age and education of an employee when compared to KPIs_met >80% and awards_won.

In [23]:
f,ax=plt.subplots(figsize=(20,10))
g=sns.boxplot(x='department',y='avg_training_score',hue='education',data=train,palette='bright');

We can observe that the departments like technology, analytics and R&D have high average training score. Both analytics and technology departments have employees which minimum have bachelor's degree. A very strange thing we can see is in sales and marketing the variance is very large even after there least education is bachelor's degree.

In [24]:
f,ax=plt.subplots(figsize=(8,10))
sns.barplot(x='no_of_trainings',y='avg_training_score',hue='education',palette='rainbow',data=train);

Employees that have below secondary education, don't have more than 4 trainings.

In [25]:
f,ax=plt.subplots(figsize=(8,6))
g=sns.boxplot(x='education',y='age',hue='gender',data=train,palette='colorblind');
In [26]:
fig,(axis1,axis2)=plt.subplots(1,2,figsize=(15,5));

sns.stripplot(x='previous_year_rating',y='age',data=train,hue='awards_won?',jitter=True,ax=axis1);
sns.stripplot(x='previous_year_rating',y='avg_training_score',data=train,hue='awards_won?',jitter=True,ax=axis2);

Whether the employee will win awards has very little to do with age and previous year rating but according to training score we see more the score, award can be won.

In [27]:
g=sns.FacetGrid(data=train,col='recruitment_channel');
g.map(plt.hist,'age');

Age in each plot is same but we see recruitments from channel other is more.Through sourcing, there is slightly less recruits and then there are very less through reference.

In [28]:
f,ax=plt.subplots(figsize=(20,10))
sns.stripplot(x='department',y='avg_training_score',data=train,hue='is_promoted',jitter=True,palette='bright');

Clearly high average training score is a factor to get promotion, but we can also see some promotions where the training score is low. Surprisingly r&d department have got employees with minimum promotions apart from having good training score and education level. Most no. of promotions can be seen in sales and marketing, operations and procurement.

In [29]:
f,ax=plt.subplots(figsize=(20,10))
g=sns.boxplot(x='region',y='avg_training_score',hue='is_promoted',data=train);
ax.set_xticklabels(ax.get_xticklabels(),rotation=30);

Clearly region has nothing to do with promotion.

In [30]:
g=sns.stripplot(x='KPIs_met >80%',y='avg_training_score',hue='is_promoted',data=train,palette='deep',jitter=True);

There is a higher chance of getting promoted if the employee falls in a category where he/she has a Key performance Indicator greater that 80%.

In [31]:
f,ax=plt.subplots(figsize=(7,7))
sns.stripplot(x='recruitment_channel',y='avg_training_score',hue='is_promoted',palette='BuPu',data=train,jitter=True);

Employees recruited from reference have less no. of promotions.

In [32]:
fig,(axis1,axis2)=plt.subplots(1,2,figsize=(15,5));

g=sns.stripplot(x='education',y='length_of_service',hue='gender',data=train,ax=axis1,palette='muted');
g=sns.stripplot(x='department',y='length_of_service',hue='gender',data=train,ax=axis2,palette='muted');
axis2.set_xticklabels(axis2.get_xticklabels(),rotation=30);
  • In first plot we see employess of education below secondary, have very less length of service.

  • If departments are concerned only analytics, r&d and legal have less no. of service in years and surprisingly there are very less no. of female employees in those departments. STRANGE

Transformation

On train set

In [33]:
sns.set_color_codes(palette='deep')
f,ax=plt.subplots(figsize=(5,5))

print("Skewness: %f" % train['age'].skew())
print("Kurtosis: %f" % train['age'].kurt())

sns.distplot(train['age'],color="b");
ax.xaxis.grid(False)
ax.set(ylabel="Frequency")
ax.set(xlabel="Age")
ax.set(title="Age distribution")
sns.despine(trim=True,left=True)
plt.show();
Skewness: 1.007432
Kurtosis: 0.792353

It is Skewed to right, so we will apply a log(1+x) transformation to fix the skew.

In [34]:
train["age_trans"]=np.log1p(train["age"])
In [35]:
sns.set_color_codes(palette='deep')
f,ax=plt.subplots(figsize=(5,5))

sns.distplot(train['age_trans'],fit=norm,color="b");

print("Skewness: %f" % train['age_trans'].skew())
print("Kurtosis: %f" % train['age_trans'].kurt())

ax.xaxis.grid(False)
ax.set(ylabel="Frequency")
ax.set(xlabel="Age")
ax.set(title="Age distribution")
sns.despine(trim=True,left=True)

plt.show()
Skewness: 0.496626
Kurtosis: -0.086587

Our Kurtosis is -0.0865 which is small. Similarly the Skewness is 0.49 which is Symmetrical.

On test set

In [36]:
sns.set_color_codes(palette='deep')
f,ax=plt.subplots(figsize=(5,5))

print("Skewness: %f" % test['age'].skew())
print("Kurtosis: %f" % test['age'].kurt())

sns.distplot(train['age'],color="b");
ax.xaxis.grid(False)
ax.set(ylabel="Frequency")
ax.set(xlabel="Age")
ax.set(title="Age distribution")
sns.despine(trim=True,left=True)
plt.show();
Skewness: 1.011777
Kurtosis: 0.792598
In [37]:
test["age_trans"]=np.log1p(test["age"])
In [38]:
sns.set_color_codes(palette='deep')
f,ax=plt.subplots(figsize=(5,5))

sns.distplot(train['age_trans'],fit=norm,color="b");

print("Skewness: %f" % test['age_trans'].skew())
print("Kurtosis: %f" % test['age_trans'].kurt())

ax.xaxis.grid(False)
ax.set(ylabel="Frequency")
ax.set(xlabel="Age")
ax.set(title="Age distribution")
sns.despine(trim=True,left=True)

plt.show()
Skewness: 0.500978
Kurtosis: -0.084135

Our Kurtosis is -0.0841 which is small. Similarly the Skewness is 0.500 which is Symmetrical.

Label Encoding

The education attribute is ordinal, we will map them with nos.

On train set

In [39]:
map_edu_tr={"Below Secondary":1,"Bachelor's":2,"Master's & above":3}
for dataset in train:
    train['education_ord']=train['education'].map(map_edu_tr)

On test set

In [40]:
map_edu_te={"Below Secondary":1,"Bachelor's":2,"Master's & above":3}
for dataset in test:
    test['education_ord']=test['education'].map(map_edu_te)
In [41]:
train=train.drop(columns=['education'])
test=test.drop(columns=['education'])

Dummification of Categorical Variables

On train set

In [42]:
dummy_tr_1=pd.get_dummies(train['department'],drop_first=True,prefix='dep',prefix_sep='_')
In [43]:
dummy_tr_2=pd.get_dummies(train['gender'],drop_first=True,prefix='gen',prefix_sep='_')
In [44]:
dummy_tr_3=pd.get_dummies(train['recruitment_channel'],drop_first=True,prefix='rc',prefix_sep='_')
In [45]:
dummy_tr_4=pd.get_dummies(train['KPIs_met >80%'],drop_first=True,prefix='kp',prefix_sep='_')
In [46]:
dummy_tr_5=pd.get_dummies(train['awards_won?'],drop_first=True,prefix='aw',prefix_sep='_')
In [47]:
train=pd.concat([train,dummy_tr_1,dummy_tr_2,dummy_tr_3,
                 dummy_tr_4,dummy_tr_5],axis=1)
In [48]:
train=train.drop(columns=['department','gender','recruitment_channel',
                          'KPIs_met >80%','awards_won?'])

On test set

In [49]:
dummy_te_1=pd.get_dummies(test['department'],drop_first=True,prefix='dep',prefix_sep='_')
In [50]:
dummy_te_2=pd.get_dummies(test['gender'],drop_first=True,prefix='gen',prefix_sep='_')
In [51]:
dummy_te_3=pd.get_dummies(test['recruitment_channel'],drop_first=True,prefix='rc',prefix_sep='_')
In [52]:
dummy_te_4=pd.get_dummies(test['KPIs_met >80%'],drop_first=True,prefix='kp',prefix_sep='_')
In [53]:
dummy_te_5=pd.get_dummies(test['awards_won?'],drop_first=True,prefix='aw',prefix_sep='_')
In [54]:
test=pd.concat([test,dummy_te_1,dummy_te_2,dummy_te_3,dummy_te_4,dummy_te_5],axis=1)
In [55]:
test=test.drop(columns=['department','gender','recruitment_channel',
                        'KPIs_met >80%','awards_won?'])

For our modelling we will split our train set in train-validation set in 67-33% proportion. But first we will assign Indpendent and Dependent variables.

Setting Independent and Dependent Variables

In [56]:
X=train.drop(columns=['employee_id','is_promoted','age','region'])
y=train['is_promoted']

Train - Validation Split

In [57]:
X_train,X_val,y_train,y_val=train_test_split(X,y,test_size=0.33,
                                             random_state=44)
In [58]:
X_train.shape
Out[58]:
(36721, 19)
In [59]:
X_train
Out[59]:
no_of_trainings previous_year_rating length_of_service avg_training_score age_trans education_ord dep_Finance dep_HR dep_Legal dep_Operations dep_Procurement dep_R&D dep_Sales & Marketing dep_Technology gen_m rc_referred rc_sourcing kp_1 aw_1
17041 1 3.0 1 45 3.332205 2 0 1 0 0 0 0 0 0 0 0 0 0 0
3800 1 2.0 5 49 3.850148 3 0 1 0 0 0 0 0 0 0 0 0 1 0
5867 1 2.0 7 48 3.637586 3 0 0 0 0 0 0 1 0 1 0 0 0 0
43501 1 5.0 3 63 3.637586 2 0 0 0 1 0 0 0 0 1 0 1 1 0
10718 1 3.0 1 50 3.332205 2 0 0 0 0 0 0 1 0 1 0 0 0 0
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
53123 1 3.0 3 50 3.401197 2 0 0 0 0 0 0 1 0 1 0 1 0 0
49723 1 2.0 2 46 3.850148 3 0 0 0 0 0 0 1 0 0 0 0 0 0
25773 3 3.0 2 50 3.332205 2 0 0 0 0 0 0 1 0 1 0 0 1 0
3491 3 3.0 10 62 3.637586 2 0 0 0 1 0 0 0 0 1 1 0 0 0
14100 1 3.0 1 62 3.555348 2 0 0 0 1 0 0 0 0 1 0 0 0 0

36721 rows × 19 columns

Modelling

Logistic Regression

In [62]:
classifier_LR=LogisticRegression()
classifier_LR.fit(X_train,y_train)
y_pred_LR_tr=classifier_LR.predict(X_train)
y_pred_LR_val=classifier_LR.predict(X_val)

f1_lr_tr=f1_score(y_train,y_pred_LR_tr)
f1_lr_val=f1_score(y_val,y_pred_LR_val)
In [63]:
print('Train F1-Score')
print(f1_lr_tr*100)
print('Validation F1-Score')
print(f1_lr_val*100)
Train F1-Score
27.357901311680198
Validation F1-Score
25.333333333333336

KNN

In [64]:
classifier_KNN=KNeighborsClassifier()
classifier_KNN.fit(X_train,y_train)
y_pred_KNN_tr=classifier_KNN.predict(X_train)
y_pred_KNN_val=classifier_KNN.predict(X_val)

f1_knn_tr=f1_score(y_train,y_pred_KNN_tr)
f1_knn_val=f1_score(y_val,y_pred_KNN_val)
In [65]:
print('Train F1-Score')
print(f1_knn_tr*100)
print('Validation F1-Score')
print(f1_knn_val*100)
Train F1-Score
46.103268226648616
Validation F1-Score
34.84444444444445

Decision Tree

In [66]:
classifier_DT=DecisionTreeClassifier()
classifier_DT.fit(X_train,y_train)
y_pred_DT_tr=classifier_DT.predict(X_train)
y_pred_DT_val=classifier_DT.predict(X_val)

f1_dt_tr=f1_score(y_train,y_pred_DT_tr)
f1_dt_val=f1_score(y_val,y_pred_DT_val)
In [67]:
print('Train F1-Score')
print(f1_dt_tr*100)
print('Validation F1-Score')
print(f1_dt_val*100)
Train F1-Score
98.00425758382119
Validation F1-Score
41.316073354908305

Grid Search CV

In [68]:
classifier_DT_GS=DecisionTreeClassifier()
params_grid_DT={'criterion':['gini','entropy'],
                'splitter':['best','random'],
                'max_depth':[5,10,15],
                'min_samples_split':[2,3,4,5],
                'min_samples_leaf':[1,2,3,4],
                'class_weight':['balanced',None]}
grid_search_DT=GridSearchCV(classifier_DT_GS,params_grid_DT,
                            n_jobs=-1,scoring='f1')
In [69]:
grid_search_DT.fit(X_train,y_train)

grid_search_DT.best_params_
Out[69]:
{'class_weight': None,
 'criterion': 'entropy',
 'max_depth': 15,
 'min_samples_leaf': 3,
 'min_samples_split': 3,
 'splitter': 'random'}
In [70]:
classifier_DT_after=DecisionTreeClassifier(class_weight=None,criterion='entropy',max_depth=15,min_samples_leaf=3,min_samples_split=3,splitter='random')
classifier_DT_after.fit(X_train,y_train)
y_pred_DT_after_val=classifier_DT_after.predict(X_val)
y_pred_DT_after_tr=classifier_DT_after.predict(X_train)

f1_dt_tuned_val=f1_score(y_val,y_pred_DT_after_val)
f1_dt_tuned_tr=f1_score(y_train,y_pred_DT_after_tr)
In [71]:
print('Train F1-Score')
print(f1_dt_tuned_tr*100)
print('Validation F1-Score')
print(f1_dt_tuned_val*100)
Train F1-Score
54.89408327246166
Validation F1-Score
48.32104832104832

Random Forest

In [72]:
classifier_RF=RandomForestClassifier()
classifier_RF.fit(X_train,y_train)
y_pred_RF_tr=classifier_RF.predict(X_train)
y_pred_RF_val=classifier_RF.predict(X_val)

f1_rf_tr=f1_score(y_train,y_pred_RF_tr)
f1_rf_val=f1_score(y_val,y_pred_RF_val)
In [73]:
print('Train F1-Score')
print(f1_rf_tr*100)
print('Validation F1-Score')
print(f1_rf_val*100)
Train F1-Score
98.00132362673726
Validation F1-Score
47.049567269866245

Gradient Boosting

In [83]:
gb=GradientBoostingClassifier(n_estimators=300,max_features=0.9,learning_rate=0.25,max_depth=4,
                              min_samples_leaf=2,subsample=1,verbose=0,random_state=12)
gb.fit(X_train,y_train)
y_pred_gb_val=gb.predict(X_val)
y_pred_gb_tr=gb.predict(X_train)

f1_gb_val=f1_score(y_val,y_pred_gb_val)
f1_gb_tr=f1_score(y_train,y_pred_gb_tr)
In [84]:
print('Train F1-Score')
print(f1_gb_tr*100)
print('Validation F1-Score')
print(f1_gb_val*100)
Train F1-Score
58.61566484517304
Validation F1-Score
50.545759865659114

XGBoost

In [75]:
xgb=XGBClassifier(learning_rate=0.1,n_estimators=150,max_depth=5,min_child_weight=5,gamma=0.3,nthread=8,subsample=0.8,
                  colsample_bytree=0.8,objective='binary:logistic',scale_pos_weight=3,seed=12)
xgb.fit(X_train,y_train)
y_pred_xgb_val=xgb.predict(X_val)
y_pred_xgb_tr=xgb.predict(X_train)

f1_xgb_val=f1_score(y_val,y_pred_xgb_val)
f1_xgb_tr=f1_score(y_train,y_pred_xgb_tr)
In [76]:
print('Train F1-Score')
print(f1_xgb_tr*100)
print('Validation F1-Score')
print(f1_xgb_val*100)
Train F1-Score
57.127232956616844
Validation F1-Score
51.417004048583

Grid Search CV

In [47]:
XGB_GS=XGBClassifier()
params_grid_XGB={'booster':['gbtree','dart'],
                 'learning_rate':[0.01,0.1,1],
                 'gamma':[0.1,0.5,1],
                 'max_depth':[7,8,9,10],
                 'min_child_weight':[1,3,5],
                 'subsample':[0.2,0.4,0.6,0.8],
                 'colsample_bytree':[0.6,0.8,1],
                 'scale_pos_weight':[2,3,4],
                 'feature_selector':['cyclic','shuffle','random','greedy','thrifty']}
grid_search_XGB=GridSearchCV(XGB_GS,params_grid_XGB,
                             n_jobs=-1,scoring='f1')
In [ ]:
grid_search_XGB.fit(X_train,y_train)

grid_search_XGB.best_params_
In [71]:
XGB_after=XGBClassifier(earning_rate=0.01,n_estimators=200,max_depth=5,min_child_weight=5,gamma=0.2,nthread=8,subsample=0.7,
                        colsample_bytree=0.7,objective='binary:logistic',scale_pos_weight=2,feature_selector='cyclic')
XGB_after.fit(X_train,y_train)
y_pred_XGB_after_val=XGB_after.predict(X_val)
y_pred_XGB_after_tr=XGB_after.predict(X_train)

f1_xgb_tuned_val=f1_score(y_val,y_pred_XGB_after_val)
f1_xgb_tuned_tr=f1_score(y_train,y_pred_XGB_after_tr)
In [72]:
print('Train F1-Score')
print(f1_xgb_tuned_tr*100)
print('Validation F1-Score')
print(f1_xgb_tuned_val*100)
Train F1-Score
63.274919917090635
Validation F1-Score
50.23334747560458

ADA Boost

In [60]:
ADA=AdaBoostClassifier()
ADA.fit(X_train,y_train)
y_pred_ADA_val=ADA.predict(X_val)
y_pred_ADA_tr=ADA.predict(X_train)

f1_ada_val=f1_score(y_val,y_pred_ADA_val)
f1_ada_tr=f1_score(y_train,y_pred_ADA_tr)
In [61]:
print('Train F1-Score')
print(f1_ada_tr*100)
print('Validation F1-Score')
print(f1_ada_val*100)
Train F1-Score
26.548223350253807
Validation F1-Score
26.795895096921324

Cat Boost

In [84]:
CB=CatBoostClassifier(max_depth=5)
CB.fit(X_train,y_train)
y_pred_CB_val=CB.predict(X_val)
y_pred_CB_tr=CB.predict(X_train)

f1_cb_val=f1_score(y_val,y_pred_CB_val)
f1_cb_tr=f1_score(y_train,y_pred_CB_tr)
Learning rate set to 0.04799
0:	learn: 0.6347130	total: 15ms	remaining: 14.9s
1:	learn: 0.5941143	total: 26.2ms	remaining: 13.1s
2:	learn: 0.5430656	total: 41.1ms	remaining: 13.7s
3:	learn: 0.5075566	total: 55.7ms	remaining: 13.9s
4:	learn: 0.4804532	total: 70.4ms	remaining: 14s
5:	learn: 0.4527066	total: 84.7ms	remaining: 14s
6:	learn: 0.4145045	total: 116ms	remaining: 16.5s
7:	learn: 0.3950372	total: 132ms	remaining: 16.3s
8:	learn: 0.3755908	total: 148ms	remaining: 16.3s
9:	learn: 0.3549981	total: 163ms	remaining: 16.1s
10:	learn: 0.3438447	total: 181ms	remaining: 16.3s
11:	learn: 0.3330831	total: 196ms	remaining: 16.1s
12:	learn: 0.3215344	total: 213ms	remaining: 16.2s
13:	learn: 0.3111697	total: 230ms	remaining: 16.2s
14:	learn: 0.3016862	total: 245ms	remaining: 16.1s
15:	learn: 0.2945987	total: 261ms	remaining: 16s
16:	learn: 0.2862004	total: 276ms	remaining: 15.9s
17:	learn: 0.2793920	total: 291ms	remaining: 15.9s
18:	learn: 0.2712276	total: 307ms	remaining: 15.9s
19:	learn: 0.2667615	total: 322ms	remaining: 15.8s
20:	learn: 0.2631148	total: 337ms	remaining: 15.7s
21:	learn: 0.2591850	total: 357ms	remaining: 15.9s
22:	learn: 0.2538057	total: 372ms	remaining: 15.8s
23:	learn: 0.2496549	total: 397ms	remaining: 16.1s
24:	learn: 0.2471066	total: 412ms	remaining: 16s
25:	learn: 0.2450307	total: 427ms	remaining: 16s
26:	learn: 0.2421214	total: 442ms	remaining: 15.9s
27:	learn: 0.2392237	total: 458ms	remaining: 15.9s
28:	learn: 0.2349750	total: 473ms	remaining: 15.8s
29:	learn: 0.2328794	total: 494ms	remaining: 16s
30:	learn: 0.2311974	total: 508ms	remaining: 15.9s
31:	learn: 0.2260436	total: 524ms	remaining: 15.9s
32:	learn: 0.2240655	total: 544ms	remaining: 15.9s
33:	learn: 0.2216084	total: 563ms	remaining: 16s
34:	learn: 0.2207379	total: 578ms	remaining: 15.9s
35:	learn: 0.2172258	total: 594ms	remaining: 15.9s
36:	learn: 0.2164987	total: 612ms	remaining: 15.9s
37:	learn: 0.2144765	total: 627ms	remaining: 15.9s
38:	learn: 0.2133807	total: 642ms	remaining: 15.8s
39:	learn: 0.2123588	total: 657ms	remaining: 15.8s
40:	learn: 0.2105070	total: 672ms	remaining: 15.7s
41:	learn: 0.2092128	total: 689ms	remaining: 15.7s
42:	learn: 0.2084130	total: 704ms	remaining: 15.7s
43:	learn: 0.2066955	total: 720ms	remaining: 15.6s
44:	learn: 0.2064136	total: 736ms	remaining: 15.6s
45:	learn: 0.2046001	total: 751ms	remaining: 15.6s
46:	learn: 0.2040757	total: 766ms	remaining: 15.5s
47:	learn: 0.2038196	total: 781ms	remaining: 15.5s
48:	learn: 0.2036771	total: 813ms	remaining: 15.8s
49:	learn: 0.2025008	total: 830ms	remaining: 15.8s
50:	learn: 0.2014387	total: 846ms	remaining: 15.7s
51:	learn: 0.2008805	total: 861ms	remaining: 15.7s
52:	learn: 0.1994608	total: 881ms	remaining: 15.7s
53:	learn: 0.1992036	total: 897ms	remaining: 15.7s
54:	learn: 0.1977200	total: 921ms	remaining: 15.8s
55:	learn: 0.1969381	total: 938ms	remaining: 15.8s
56:	learn: 0.1959099	total: 958ms	remaining: 15.8s
57:	learn: 0.1956010	total: 973ms	remaining: 15.8s
58:	learn: 0.1944269	total: 988ms	remaining: 15.8s
59:	learn: 0.1942344	total: 1s	remaining: 15.7s
60:	learn: 0.1941096	total: 1.02s	remaining: 15.7s
61:	learn: 0.1938395	total: 1.03s	remaining: 15.6s
62:	learn: 0.1931913	total: 1.05s	remaining: 15.7s
63:	learn: 0.1927421	total: 1.07s	remaining: 15.6s
64:	learn: 0.1922972	total: 1.09s	remaining: 15.6s
65:	learn: 0.1915407	total: 1.1s	remaining: 15.6s
66:	learn: 0.1905782	total: 1.12s	remaining: 15.6s
67:	learn: 0.1897217	total: 1.13s	remaining: 15.6s
68:	learn: 0.1889237	total: 1.15s	remaining: 15.5s
69:	learn: 0.1882965	total: 1.17s	remaining: 15.5s
70:	learn: 0.1882851	total: 1.18s	remaining: 15.4s
71:	learn: 0.1877101	total: 1.2s	remaining: 15.5s
72:	learn: 0.1875662	total: 1.22s	remaining: 15.4s
73:	learn: 0.1873778	total: 1.23s	remaining: 15.4s
74:	learn: 0.1866202	total: 1.25s	remaining: 15.4s
75:	learn: 0.1863534	total: 1.26s	remaining: 15.3s
76:	learn: 0.1861114	total: 1.28s	remaining: 15.3s
77:	learn: 0.1859665	total: 1.3s	remaining: 15.3s
78:	learn: 0.1853411	total: 1.33s	remaining: 15.5s
79:	learn: 0.1852200	total: 1.34s	remaining: 15.4s
80:	learn: 0.1851094	total: 1.37s	remaining: 15.5s
81:	learn: 0.1845033	total: 1.39s	remaining: 15.5s
82:	learn: 0.1843221	total: 1.41s	remaining: 15.6s
83:	learn: 0.1838259	total: 1.43s	remaining: 15.6s
84:	learn: 0.1835453	total: 1.45s	remaining: 15.6s
85:	learn: 0.1834403	total: 1.46s	remaining: 15.5s
86:	learn: 0.1832376	total: 1.48s	remaining: 15.5s
87:	learn: 0.1829176	total: 1.5s	remaining: 15.5s
88:	learn: 0.1822890	total: 1.51s	remaining: 15.5s
89:	learn: 0.1818193	total: 1.53s	remaining: 15.5s
90:	learn: 0.1816880	total: 1.55s	remaining: 15.4s
91:	learn: 0.1813974	total: 1.56s	remaining: 15.4s
92:	learn: 0.1813233	total: 1.58s	remaining: 15.4s
93:	learn: 0.1808696	total: 1.59s	remaining: 15.4s
94:	learn: 0.1805558	total: 1.61s	remaining: 15.4s
95:	learn: 0.1805142	total: 1.63s	remaining: 15.3s
96:	learn: 0.1803583	total: 1.65s	remaining: 15.4s
97:	learn: 0.1802759	total: 1.67s	remaining: 15.3s
98:	learn: 0.1799820	total: 1.69s	remaining: 15.4s
99:	learn: 0.1798659	total: 1.7s	remaining: 15.3s
100:	learn: 0.1796069	total: 1.72s	remaining: 15.3s
101:	learn: 0.1794597	total: 1.73s	remaining: 15.3s
102:	learn: 0.1791187	total: 1.75s	remaining: 15.2s
103:	learn: 0.1790486	total: 1.76s	remaining: 15.2s
104:	learn: 0.1784802	total: 1.8s	remaining: 15.3s
105:	learn: 0.1783744	total: 1.82s	remaining: 15.3s
106:	learn: 0.1780372	total: 1.83s	remaining: 15.3s
107:	learn: 0.1778955	total: 1.85s	remaining: 15.3s
108:	learn: 0.1776127	total: 1.88s	remaining: 15.3s
109:	learn: 0.1775259	total: 1.92s	remaining: 15.5s
110:	learn: 0.1773663	total: 1.95s	remaining: 15.6s
111:	learn: 0.1772873	total: 1.97s	remaining: 15.6s
112:	learn: 0.1772101	total: 2s	remaining: 15.7s
113:	learn: 0.1769929	total: 2.02s	remaining: 15.7s
114:	learn: 0.1768556	total: 2.06s	remaining: 15.8s
115:	learn: 0.1764808	total: 2.08s	remaining: 15.9s
116:	learn: 0.1763954	total: 2.12s	remaining: 16s
117:	learn: 0.1760720	total: 2.15s	remaining: 16.1s
118:	learn: 0.1760173	total: 2.17s	remaining: 16.1s
119:	learn: 0.1758337	total: 2.21s	remaining: 16.2s
120:	learn: 0.1757827	total: 2.23s	remaining: 16.2s
121:	learn: 0.1756935	total: 2.27s	remaining: 16.4s
122:	learn: 0.1754716	total: 2.29s	remaining: 16.3s
123:	learn: 0.1754234	total: 2.32s	remaining: 16.4s
124:	learn: 0.1753148	total: 2.34s	remaining: 16.4s
125:	learn: 0.1752707	total: 2.38s	remaining: 16.5s
126:	learn: 0.1748897	total: 2.4s	remaining: 16.5s
127:	learn: 0.1747327	total: 2.43s	remaining: 16.6s
128:	learn: 0.1747021	total: 2.45s	remaining: 16.5s
129:	learn: 0.1745293	total: 2.48s	remaining: 16.6s
130:	learn: 0.1742356	total: 2.5s	remaining: 16.6s
131:	learn: 0.1742087	total: 2.52s	remaining: 16.6s
132:	learn: 0.1741029	total: 2.54s	remaining: 16.5s
133:	learn: 0.1740143	total: 2.55s	remaining: 16.5s
134:	learn: 0.1737055	total: 2.58s	remaining: 16.5s
135:	learn: 0.1736608	total: 2.61s	remaining: 16.6s
136:	learn: 0.1735650	total: 2.63s	remaining: 16.6s
137:	learn: 0.1734099	total: 2.65s	remaining: 16.5s
138:	learn: 0.1732461	total: 2.67s	remaining: 16.5s
139:	learn: 0.1731944	total: 2.7s	remaining: 16.6s
140:	learn: 0.1728978	total: 2.72s	remaining: 16.6s
141:	learn: 0.1727385	total: 2.74s	remaining: 16.6s
142:	learn: 0.1725253	total: 2.76s	remaining: 16.5s
143:	learn: 0.1725127	total: 2.77s	remaining: 16.5s
144:	learn: 0.1723662	total: 2.79s	remaining: 16.4s
145:	learn: 0.1722766	total: 2.8s	remaining: 16.4s
146:	learn: 0.1722314	total: 2.82s	remaining: 16.4s
147:	learn: 0.1721372	total: 2.84s	remaining: 16.3s
148:	learn: 0.1721049	total: 2.85s	remaining: 16.3s
149:	learn: 0.1720607	total: 2.87s	remaining: 16.3s
150:	learn: 0.1720372	total: 2.88s	remaining: 16.2s
151:	learn: 0.1717560	total: 2.9s	remaining: 16.2s
152:	learn: 0.1716419	total: 2.92s	remaining: 16.2s
153:	learn: 0.1715796	total: 2.94s	remaining: 16.1s
154:	learn: 0.1712452	total: 2.95s	remaining: 16.1s
155:	learn: 0.1711057	total: 2.97s	remaining: 16.1s
156:	learn: 0.1710331	total: 2.98s	remaining: 16s
157:	learn: 0.1709830	total: 3s	remaining: 16s
158:	learn: 0.1706745	total: 3.02s	remaining: 16s
159:	learn: 0.1705731	total: 3.03s	remaining: 15.9s
160:	learn: 0.1705433	total: 3.05s	remaining: 15.9s
161:	learn: 0.1704953	total: 3.07s	remaining: 15.9s
162:	learn: 0.1704084	total: 3.08s	remaining: 15.8s
163:	learn: 0.1703531	total: 3.1s	remaining: 15.8s
164:	learn: 0.1702894	total: 3.11s	remaining: 15.8s
165:	learn: 0.1701821	total: 3.13s	remaining: 15.7s
166:	learn: 0.1701334	total: 3.15s	remaining: 15.7s
167:	learn: 0.1701054	total: 3.16s	remaining: 15.7s
168:	learn: 0.1700784	total: 3.18s	remaining: 15.6s
169:	learn: 0.1699751	total: 3.2s	remaining: 15.6s
170:	learn: 0.1699293	total: 3.21s	remaining: 15.6s
171:	learn: 0.1698261	total: 3.23s	remaining: 15.5s
172:	learn: 0.1697955	total: 3.24s	remaining: 15.5s
173:	learn: 0.1697688	total: 3.26s	remaining: 15.5s
174:	learn: 0.1696696	total: 3.27s	remaining: 15.4s
175:	learn: 0.1695631	total: 3.29s	remaining: 15.4s
176:	learn: 0.1694805	total: 3.31s	remaining: 15.4s
177:	learn: 0.1694035	total: 3.33s	remaining: 15.4s
178:	learn: 0.1693701	total: 3.35s	remaining: 15.3s
179:	learn: 0.1692160	total: 3.36s	remaining: 15.3s
180:	learn: 0.1690605	total: 3.38s	remaining: 15.3s
181:	learn: 0.1690104	total: 3.39s	remaining: 15.3s
182:	learn: 0.1689410	total: 3.42s	remaining: 15.3s
183:	learn: 0.1689208	total: 3.43s	remaining: 15.2s
184:	learn: 0.1688657	total: 3.45s	remaining: 15.2s
185:	learn: 0.1688187	total: 3.46s	remaining: 15.2s
186:	learn: 0.1687865	total: 3.48s	remaining: 15.1s
187:	learn: 0.1687316	total: 3.5s	remaining: 15.1s
188:	learn: 0.1686861	total: 3.52s	remaining: 15.1s
189:	learn: 0.1686226	total: 3.54s	remaining: 15.1s
190:	learn: 0.1685020	total: 3.57s	remaining: 15.1s
191:	learn: 0.1684019	total: 3.59s	remaining: 15.1s
192:	learn: 0.1683712	total: 3.61s	remaining: 15.1s
193:	learn: 0.1683472	total: 3.63s	remaining: 15.1s
194:	learn: 0.1682351	total: 3.65s	remaining: 15.1s
195:	learn: 0.1681619	total: 3.67s	remaining: 15s
196:	learn: 0.1681157	total: 3.68s	remaining: 15s
197:	learn: 0.1680113	total: 3.7s	remaining: 15s
198:	learn: 0.1679519	total: 3.72s	remaining: 15s
199:	learn: 0.1678763	total: 3.74s	remaining: 15s
200:	learn: 0.1678282	total: 3.75s	remaining: 14.9s
201:	learn: 0.1677877	total: 3.77s	remaining: 14.9s
202:	learn: 0.1677401	total: 3.79s	remaining: 14.9s
203:	learn: 0.1676914	total: 3.8s	remaining: 14.8s
204:	learn: 0.1676493	total: 3.83s	remaining: 14.8s
205:	learn: 0.1675769	total: 3.85s	remaining: 14.8s
206:	learn: 0.1675417	total: 3.88s	remaining: 14.8s
207:	learn: 0.1674894	total: 3.89s	remaining: 14.8s
208:	learn: 0.1674117	total: 3.92s	remaining: 14.8s
209:	learn: 0.1673700	total: 3.94s	remaining: 14.8s
210:	learn: 0.1673285	total: 3.96s	remaining: 14.8s
211:	learn: 0.1672577	total: 3.98s	remaining: 14.8s
212:	learn: 0.1672357	total: 4.02s	remaining: 14.8s
213:	learn: 0.1671976	total: 4.04s	remaining: 14.8s
214:	learn: 0.1671410	total: 4.07s	remaining: 14.9s
215:	learn: 0.1669136	total: 4.09s	remaining: 14.8s
216:	learn: 0.1668859	total: 4.12s	remaining: 14.9s
217:	learn: 0.1667058	total: 4.14s	remaining: 14.9s
218:	learn: 0.1666495	total: 4.17s	remaining: 14.9s
219:	learn: 0.1665891	total: 4.19s	remaining: 14.8s
220:	learn: 0.1664762	total: 4.21s	remaining: 14.8s
221:	learn: 0.1664298	total: 4.23s	remaining: 14.8s
222:	learn: 0.1663634	total: 4.25s	remaining: 14.8s
223:	learn: 0.1663369	total: 4.27s	remaining: 14.8s
224:	learn: 0.1663001	total: 4.29s	remaining: 14.8s
225:	learn: 0.1662680	total: 4.31s	remaining: 14.8s
226:	learn: 0.1662301	total: 4.33s	remaining: 14.8s
227:	learn: 0.1661663	total: 4.35s	remaining: 14.7s
228:	learn: 0.1661442	total: 4.37s	remaining: 14.7s
229:	learn: 0.1660627	total: 4.38s	remaining: 14.7s
230:	learn: 0.1659525	total: 4.4s	remaining: 14.7s
231:	learn: 0.1659258	total: 4.42s	remaining: 14.6s
232:	learn: 0.1658606	total: 4.43s	remaining: 14.6s
233:	learn: 0.1658071	total: 4.45s	remaining: 14.6s
234:	learn: 0.1657659	total: 4.47s	remaining: 14.6s
235:	learn: 0.1657341	total: 4.49s	remaining: 14.5s
236:	learn: 0.1656613	total: 4.5s	remaining: 14.5s
237:	learn: 0.1656200	total: 4.52s	remaining: 14.5s
238:	learn: 0.1655910	total: 4.55s	remaining: 14.5s
239:	learn: 0.1655439	total: 4.57s	remaining: 14.5s
240:	learn: 0.1654858	total: 4.59s	remaining: 14.4s
241:	learn: 0.1653856	total: 4.6s	remaining: 14.4s
242:	learn: 0.1653492	total: 4.62s	remaining: 14.4s
243:	learn: 0.1653195	total: 4.64s	remaining: 14.4s
244:	learn: 0.1652795	total: 4.66s	remaining: 14.4s
245:	learn: 0.1652160	total: 4.67s	remaining: 14.3s
246:	learn: 0.1651734	total: 4.7s	remaining: 14.3s
247:	learn: 0.1651143	total: 4.71s	remaining: 14.3s
248:	learn: 0.1650848	total: 4.73s	remaining: 14.3s
249:	learn: 0.1650491	total: 4.75s	remaining: 14.3s
250:	learn: 0.1649944	total: 4.77s	remaining: 14.2s
251:	learn: 0.1649575	total: 4.78s	remaining: 14.2s
252:	learn: 0.1648961	total: 4.8s	remaining: 14.2s
253:	learn: 0.1648578	total: 4.82s	remaining: 14.1s
254:	learn: 0.1648281	total: 4.83s	remaining: 14.1s
255:	learn: 0.1647307	total: 4.85s	remaining: 14.1s
256:	learn: 0.1646678	total: 4.86s	remaining: 14.1s
257:	learn: 0.1646398	total: 4.88s	remaining: 14s
258:	learn: 0.1645696	total: 4.9s	remaining: 14s
259:	learn: 0.1645338	total: 4.91s	remaining: 14s
260:	learn: 0.1645052	total: 4.93s	remaining: 14s
261:	learn: 0.1644721	total: 4.95s	remaining: 13.9s
262:	learn: 0.1644378	total: 4.97s	remaining: 13.9s
263:	learn: 0.1643898	total: 4.99s	remaining: 13.9s
264:	learn: 0.1643596	total: 5.01s	remaining: 13.9s
265:	learn: 0.1643287	total: 5.03s	remaining: 13.9s
266:	learn: 0.1642873	total: 5.05s	remaining: 13.9s
267:	learn: 0.1642460	total: 5.07s	remaining: 13.8s
268:	learn: 0.1642172	total: 5.1s	remaining: 13.9s
269:	learn: 0.1641671	total: 5.13s	remaining: 13.9s
270:	learn: 0.1641451	total: 5.15s	remaining: 13.9s
271:	learn: 0.1640500	total: 5.17s	remaining: 13.9s
272:	learn: 0.1640085	total: 5.21s	remaining: 13.9s
273:	learn: 0.1639929	total: 5.22s	remaining: 13.8s
274:	learn: 0.1639638	total: 5.24s	remaining: 13.8s
275:	learn: 0.1639059	total: 5.26s	remaining: 13.8s
276:	learn: 0.1638682	total: 5.28s	remaining: 13.8s
277:	learn: 0.1638256	total: 5.29s	remaining: 13.8s
278:	learn: 0.1637928	total: 5.31s	remaining: 13.7s
279:	learn: 0.1637497	total: 5.33s	remaining: 13.7s
280:	learn: 0.1637272	total: 5.36s	remaining: 13.7s
281:	learn: 0.1637078	total: 5.38s	remaining: 13.7s
282:	learn: 0.1636479	total: 5.4s	remaining: 13.7s
283:	learn: 0.1636205	total: 5.42s	remaining: 13.7s
284:	learn: 0.1635805	total: 5.43s	remaining: 13.6s
285:	learn: 0.1635486	total: 5.45s	remaining: 13.6s
286:	learn: 0.1635162	total: 5.47s	remaining: 13.6s
287:	learn: 0.1634918	total: 5.48s	remaining: 13.6s
288:	learn: 0.1634236	total: 5.5s	remaining: 13.5s
289:	learn: 0.1633995	total: 5.52s	remaining: 13.5s
290:	learn: 0.1633459	total: 5.54s	remaining: 13.5s
291:	learn: 0.1633188	total: 5.55s	remaining: 13.5s
292:	learn: 0.1632730	total: 5.58s	remaining: 13.5s
293:	learn: 0.1632554	total: 5.59s	remaining: 13.4s
294:	learn: 0.1632300	total: 5.61s	remaining: 13.4s
295:	learn: 0.1631507	total: 5.62s	remaining: 13.4s
296:	learn: 0.1631245	total: 5.64s	remaining: 13.4s
297:	learn: 0.1631088	total: 5.66s	remaining: 13.3s
298:	learn: 0.1630630	total: 5.68s	remaining: 13.3s
299:	learn: 0.1630233	total: 5.69s	remaining: 13.3s
300:	learn: 0.1629976	total: 5.71s	remaining: 13.3s
301:	learn: 0.1629694	total: 5.73s	remaining: 13.2s
302:	learn: 0.1629227	total: 5.75s	remaining: 13.2s
303:	learn: 0.1628973	total: 5.78s	remaining: 13.2s
304:	learn: 0.1628725	total: 5.8s	remaining: 13.2s
305:	learn: 0.1628475	total: 5.82s	remaining: 13.2s
306:	learn: 0.1628159	total: 5.84s	remaining: 13.2s
307:	learn: 0.1628015	total: 5.85s	remaining: 13.2s
308:	learn: 0.1627806	total: 5.88s	remaining: 13.1s
309:	learn: 0.1627534	total: 5.89s	remaining: 13.1s
310:	learn: 0.1627238	total: 5.92s	remaining: 13.1s
311:	learn: 0.1626669	total: 5.94s	remaining: 13.1s
312:	learn: 0.1626440	total: 5.96s	remaining: 13.1s
313:	learn: 0.1626133	total: 5.98s	remaining: 13.1s
314:	learn: 0.1625662	total: 6.01s	remaining: 13.1s
315:	learn: 0.1625453	total: 6.03s	remaining: 13s
316:	learn: 0.1625234	total: 6.06s	remaining: 13.1s
317:	learn: 0.1625016	total: 6.08s	remaining: 13s
318:	learn: 0.1624600	total: 6.1s	remaining: 13s
319:	learn: 0.1624264	total: 6.12s	remaining: 13s
320:	learn: 0.1623889	total: 6.14s	remaining: 13s
321:	learn: 0.1623748	total: 6.15s	remaining: 13s
322:	learn: 0.1623205	total: 6.17s	remaining: 12.9s
323:	learn: 0.1622884	total: 6.19s	remaining: 12.9s
324:	learn: 0.1621508	total: 6.2s	remaining: 12.9s
325:	learn: 0.1620369	total: 6.22s	remaining: 12.9s
326:	learn: 0.1619408	total: 6.24s	remaining: 12.8s
327:	learn: 0.1619190	total: 6.26s	remaining: 12.8s
328:	learn: 0.1618378	total: 6.28s	remaining: 12.8s
329:	learn: 0.1618052	total: 6.29s	remaining: 12.8s
330:	learn: 0.1617668	total: 6.31s	remaining: 12.8s
331:	learn: 0.1617356	total: 6.33s	remaining: 12.7s
332:	learn: 0.1616852	total: 6.35s	remaining: 12.7s
333:	learn: 0.1616512	total: 6.36s	remaining: 12.7s
334:	learn: 0.1616291	total: 6.38s	remaining: 12.7s
335:	learn: 0.1615910	total: 6.4s	remaining: 12.6s
336:	learn: 0.1615793	total: 6.41s	remaining: 12.6s
337:	learn: 0.1615534	total: 6.43s	remaining: 12.6s
338:	learn: 0.1615337	total: 6.45s	remaining: 12.6s
339:	learn: 0.1614881	total: 6.46s	remaining: 12.5s
340:	learn: 0.1614635	total: 6.48s	remaining: 12.5s
341:	learn: 0.1614457	total: 6.49s	remaining: 12.5s
342:	learn: 0.1614280	total: 6.51s	remaining: 12.5s
343:	learn: 0.1613784	total: 6.52s	remaining: 12.4s
344:	learn: 0.1613582	total: 6.54s	remaining: 12.4s
345:	learn: 0.1613105	total: 6.55s	remaining: 12.4s
346:	learn: 0.1612828	total: 6.57s	remaining: 12.4s
347:	learn: 0.1612643	total: 6.59s	remaining: 12.3s
348:	learn: 0.1612117	total: 6.6s	remaining: 12.3s
349:	learn: 0.1611940	total: 6.62s	remaining: 12.3s
350:	learn: 0.1611695	total: 6.64s	remaining: 12.3s
351:	learn: 0.1611340	total: 6.65s	remaining: 12.2s
352:	learn: 0.1611076	total: 6.67s	remaining: 12.2s
353:	learn: 0.1610772	total: 6.68s	remaining: 12.2s
354:	learn: 0.1610416	total: 6.7s	remaining: 12.2s
355:	learn: 0.1610288	total: 6.72s	remaining: 12.2s
356:	learn: 0.1610015	total: 6.73s	remaining: 12.1s
357:	learn: 0.1609766	total: 6.75s	remaining: 12.1s
358:	learn: 0.1609431	total: 6.77s	remaining: 12.1s
359:	learn: 0.1609155	total: 6.78s	remaining: 12.1s
360:	learn: 0.1608839	total: 6.8s	remaining: 12s
361:	learn: 0.1608635	total: 6.81s	remaining: 12s
362:	learn: 0.1608366	total: 6.83s	remaining: 12s
363:	learn: 0.1608169	total: 6.84s	remaining: 12s
364:	learn: 0.1607944	total: 6.86s	remaining: 11.9s
365:	learn: 0.1607730	total: 6.88s	remaining: 11.9s
366:	learn: 0.1607485	total: 6.9s	remaining: 11.9s
367:	learn: 0.1607134	total: 6.92s	remaining: 11.9s
368:	learn: 0.1606791	total: 6.93s	remaining: 11.9s
369:	learn: 0.1606466	total: 6.95s	remaining: 11.8s
370:	learn: 0.1606304	total: 6.96s	remaining: 11.8s
371:	learn: 0.1605974	total: 6.98s	remaining: 11.8s
372:	learn: 0.1605762	total: 7s	remaining: 11.8s
373:	learn: 0.1605493	total: 7.01s	remaining: 11.7s
374:	learn: 0.1605176	total: 7.03s	remaining: 11.7s
375:	learn: 0.1604896	total: 7.05s	remaining: 11.7s
376:	learn: 0.1604507	total: 7.07s	remaining: 11.7s
377:	learn: 0.1604281	total: 7.09s	remaining: 11.7s
378:	learn: 0.1603840	total: 7.1s	remaining: 11.6s
379:	learn: 0.1603490	total: 7.12s	remaining: 11.6s
380:	learn: 0.1603375	total: 7.14s	remaining: 11.6s
381:	learn: 0.1602832	total: 7.15s	remaining: 11.6s
382:	learn: 0.1602507	total: 7.17s	remaining: 11.6s
383:	learn: 0.1602158	total: 7.19s	remaining: 11.5s
384:	learn: 0.1601987	total: 7.21s	remaining: 11.5s
385:	learn: 0.1601796	total: 7.23s	remaining: 11.5s
386:	learn: 0.1601411	total: 7.24s	remaining: 11.5s
387:	learn: 0.1601147	total: 7.26s	remaining: 11.5s
388:	learn: 0.1600759	total: 7.28s	remaining: 11.4s
389:	learn: 0.1600471	total: 7.29s	remaining: 11.4s
390:	learn: 0.1600127	total: 7.32s	remaining: 11.4s
391:	learn: 0.1599812	total: 7.34s	remaining: 11.4s
392:	learn: 0.1599432	total: 7.36s	remaining: 11.4s
393:	learn: 0.1599220	total: 7.38s	remaining: 11.3s
394:	learn: 0.1598901	total: 7.4s	remaining: 11.3s
395:	learn: 0.1598713	total: 7.42s	remaining: 11.3s
396:	learn: 0.1598310	total: 7.44s	remaining: 11.3s
397:	learn: 0.1598005	total: 7.46s	remaining: 11.3s
398:	learn: 0.1597814	total: 7.48s	remaining: 11.3s
399:	learn: 0.1597473	total: 7.5s	remaining: 11.2s
400:	learn: 0.1597231	total: 7.52s	remaining: 11.2s
401:	learn: 0.1596886	total: 7.54s	remaining: 11.2s
402:	learn: 0.1596589	total: 7.57s	remaining: 11.2s
403:	learn: 0.1596388	total: 7.59s	remaining: 11.2s
404:	learn: 0.1596142	total: 7.61s	remaining: 11.2s
405:	learn: 0.1595793	total: 7.63s	remaining: 11.2s
406:	learn: 0.1595527	total: 7.64s	remaining: 11.1s
407:	learn: 0.1595325	total: 7.66s	remaining: 11.1s
408:	learn: 0.1595021	total: 7.69s	remaining: 11.1s
409:	learn: 0.1594786	total: 7.7s	remaining: 11.1s
410:	learn: 0.1594542	total: 7.72s	remaining: 11.1s
411:	learn: 0.1594378	total: 7.74s	remaining: 11s
412:	learn: 0.1594172	total: 7.77s	remaining: 11s
413:	learn: 0.1593960	total: 7.79s	remaining: 11s
414:	learn: 0.1593817	total: 7.8s	remaining: 11s
415:	learn: 0.1593563	total: 7.82s	remaining: 11s
416:	learn: 0.1593194	total: 7.85s	remaining: 11s
417:	learn: 0.1592935	total: 7.86s	remaining: 10.9s
418:	learn: 0.1592704	total: 7.88s	remaining: 10.9s
419:	learn: 0.1592505	total: 7.9s	remaining: 10.9s
420:	learn: 0.1591966	total: 7.92s	remaining: 10.9s
421:	learn: 0.1591464	total: 7.94s	remaining: 10.9s
422:	learn: 0.1591155	total: 7.96s	remaining: 10.9s
423:	learn: 0.1590884	total: 7.98s	remaining: 10.8s
424:	learn: 0.1590689	total: 8s	remaining: 10.8s
425:	learn: 0.1590413	total: 8.02s	remaining: 10.8s
426:	learn: 0.1590144	total: 8.04s	remaining: 10.8s
427:	learn: 0.1589785	total: 8.06s	remaining: 10.8s
428:	learn: 0.1589508	total: 8.07s	remaining: 10.7s
429:	learn: 0.1589216	total: 8.1s	remaining: 10.7s
430:	learn: 0.1589044	total: 8.12s	remaining: 10.7s
431:	learn: 0.1588816	total: 8.13s	remaining: 10.7s
432:	learn: 0.1588612	total: 8.15s	remaining: 10.7s
433:	learn: 0.1588528	total: 8.17s	remaining: 10.7s
434:	learn: 0.1588245	total: 8.19s	remaining: 10.6s
435:	learn: 0.1587932	total: 8.21s	remaining: 10.6s
436:	learn: 0.1587649	total: 8.23s	remaining: 10.6s
437:	learn: 0.1587437	total: 8.25s	remaining: 10.6s
438:	learn: 0.1587202	total: 8.27s	remaining: 10.6s
439:	learn: 0.1586995	total: 8.29s	remaining: 10.5s
440:	learn: 0.1586868	total: 8.31s	remaining: 10.5s
441:	learn: 0.1585756	total: 8.32s	remaining: 10.5s
442:	learn: 0.1585468	total: 8.34s	remaining: 10.5s
443:	learn: 0.1585206	total: 8.36s	remaining: 10.5s
444:	learn: 0.1584971	total: 8.37s	remaining: 10.4s
445:	learn: 0.1584757	total: 8.39s	remaining: 10.4s
446:	learn: 0.1584554	total: 8.41s	remaining: 10.4s
447:	learn: 0.1584436	total: 8.42s	remaining: 10.4s
448:	learn: 0.1583655	total: 8.44s	remaining: 10.4s
449:	learn: 0.1583322	total: 8.46s	remaining: 10.3s
450:	learn: 0.1583031	total: 8.48s	remaining: 10.3s
451:	learn: 0.1582845	total: 8.5s	remaining: 10.3s
452:	learn: 0.1582520	total: 8.52s	remaining: 10.3s
453:	learn: 0.1582040	total: 8.54s	remaining: 10.3s
454:	learn: 0.1581785	total: 8.55s	remaining: 10.2s
455:	learn: 0.1581522	total: 8.57s	remaining: 10.2s
456:	learn: 0.1581206	total: 8.59s	remaining: 10.2s
457:	learn: 0.1581011	total: 8.6s	remaining: 10.2s
458:	learn: 0.1580690	total: 8.63s	remaining: 10.2s
459:	learn: 0.1580561	total: 8.65s	remaining: 10.2s
460:	learn: 0.1580397	total: 8.67s	remaining: 10.1s
461:	learn: 0.1580211	total: 8.69s	remaining: 10.1s
462:	learn: 0.1579887	total: 8.71s	remaining: 10.1s
463:	learn: 0.1579646	total: 8.72s	remaining: 10.1s
464:	learn: 0.1579365	total: 8.74s	remaining: 10.1s
465:	learn: 0.1579171	total: 8.76s	remaining: 10s
466:	learn: 0.1578911	total: 8.78s	remaining: 10s
467:	learn: 0.1578609	total: 8.79s	remaining: 9.99s
468:	learn: 0.1578389	total: 8.81s	remaining: 9.97s
469:	learn: 0.1578254	total: 8.83s	remaining: 9.95s
470:	learn: 0.1577948	total: 8.85s	remaining: 9.94s
471:	learn: 0.1577730	total: 8.87s	remaining: 9.92s
472:	learn: 0.1577547	total: 8.88s	remaining: 9.89s
473:	learn: 0.1577450	total: 8.9s	remaining: 9.87s
474:	learn: 0.1577214	total: 8.92s	remaining: 9.86s
475:	learn: 0.1576793	total: 8.94s	remaining: 9.84s
476:	learn: 0.1576588	total: 8.95s	remaining: 9.81s
477:	learn: 0.1576363	total: 8.97s	remaining: 9.79s
478:	learn: 0.1576189	total: 8.98s	remaining: 9.77s
479:	learn: 0.1575957	total: 9s	remaining: 9.75s
480:	learn: 0.1575772	total: 9.02s	remaining: 9.73s
481:	learn: 0.1575600	total: 9.03s	remaining: 9.71s
482:	learn: 0.1575354	total: 9.05s	remaining: 9.69s
483:	learn: 0.1575165	total: 9.06s	remaining: 9.66s
484:	learn: 0.1574899	total: 9.09s	remaining: 9.65s
485:	learn: 0.1574699	total: 9.1s	remaining: 9.63s
486:	learn: 0.1574569	total: 9.12s	remaining: 9.61s
487:	learn: 0.1574317	total: 9.14s	remaining: 9.59s
488:	learn: 0.1574006	total: 9.16s	remaining: 9.57s
489:	learn: 0.1573764	total: 9.17s	remaining: 9.55s
490:	learn: 0.1573410	total: 9.2s	remaining: 9.53s
491:	learn: 0.1573245	total: 9.23s	remaining: 9.53s
492:	learn: 0.1572920	total: 9.25s	remaining: 9.51s
493:	learn: 0.1572252	total: 9.26s	remaining: 9.49s
494:	learn: 0.1571930	total: 9.28s	remaining: 9.47s
495:	learn: 0.1571673	total: 9.3s	remaining: 9.45s
496:	learn: 0.1571481	total: 9.31s	remaining: 9.43s
497:	learn: 0.1571297	total: 9.33s	remaining: 9.4s
498:	learn: 0.1571136	total: 9.35s	remaining: 9.38s
499:	learn: 0.1570889	total: 9.36s	remaining: 9.36s
500:	learn: 0.1570777	total: 9.38s	remaining: 9.34s
501:	learn: 0.1570718	total: 9.4s	remaining: 9.32s
502:	learn: 0.1570586	total: 9.41s	remaining: 9.3s
503:	learn: 0.1569998	total: 9.43s	remaining: 9.28s
504:	learn: 0.1569856	total: 9.44s	remaining: 9.26s
505:	learn: 0.1569591	total: 9.46s	remaining: 9.23s
506:	learn: 0.1569207	total: 9.47s	remaining: 9.21s
507:	learn: 0.1568935	total: 9.49s	remaining: 9.19s
508:	learn: 0.1568774	total: 9.51s	remaining: 9.17s
509:	learn: 0.1568676	total: 9.52s	remaining: 9.15s
510:	learn: 0.1568525	total: 9.54s	remaining: 9.13s
511:	learn: 0.1568319	total: 9.56s	remaining: 9.11s
512:	learn: 0.1568108	total: 9.58s	remaining: 9.09s
513:	learn: 0.1567996	total: 9.6s	remaining: 9.08s
514:	learn: 0.1567709	total: 9.63s	remaining: 9.07s
515:	learn: 0.1567542	total: 9.65s	remaining: 9.05s
516:	learn: 0.1567205	total: 9.67s	remaining: 9.03s
517:	learn: 0.1566742	total: 9.69s	remaining: 9.01s
518:	learn: 0.1566545	total: 9.7s	remaining: 8.99s
519:	learn: 0.1566373	total: 9.71s	remaining: 8.97s
520:	learn: 0.1566183	total: 9.73s	remaining: 8.95s
521:	learn: 0.1565925	total: 9.75s	remaining: 8.93s
522:	learn: 0.1565682	total: 9.78s	remaining: 8.92s
523:	learn: 0.1565312	total: 9.79s	remaining: 8.89s
524:	learn: 0.1565080	total: 9.81s	remaining: 8.87s
525:	learn: 0.1564878	total: 9.82s	remaining: 8.85s
526:	learn: 0.1564506	total: 9.84s	remaining: 8.83s
527:	learn: 0.1564296	total: 9.85s	remaining: 8.81s
528:	learn: 0.1564108	total: 9.87s	remaining: 8.79s
529:	learn: 0.1563956	total: 9.89s	remaining: 8.77s
530:	learn: 0.1563755	total: 9.9s	remaining: 8.75s
531:	learn: 0.1563400	total: 9.92s	remaining: 8.73s
532:	learn: 0.1563030	total: 9.95s	remaining: 8.72s
533:	learn: 0.1562677	total: 9.96s	remaining: 8.7s
534:	learn: 0.1562573	total: 9.98s	remaining: 8.68s
535:	learn: 0.1562329	total: 10s	remaining: 8.65s
536:	learn: 0.1562059	total: 10s	remaining: 8.63s
537:	learn: 0.1561974	total: 10s	remaining: 8.61s
538:	learn: 0.1561514	total: 10s	remaining: 8.59s
539:	learn: 0.1561356	total: 10.1s	remaining: 8.57s
540:	learn: 0.1561110	total: 10.1s	remaining: 8.55s
541:	learn: 0.1560693	total: 10.1s	remaining: 8.53s
542:	learn: 0.1560530	total: 10.1s	remaining: 8.52s
543:	learn: 0.1560266	total: 10.1s	remaining: 8.5s
544:	learn: 0.1560170	total: 10.2s	remaining: 8.48s
545:	learn: 0.1560051	total: 10.2s	remaining: 8.46s
546:	learn: 0.1559836	total: 10.2s	remaining: 8.45s
547:	learn: 0.1559568	total: 10.2s	remaining: 8.43s
548:	learn: 0.1559385	total: 10.2s	remaining: 8.41s
549:	learn: 0.1559286	total: 10.3s	remaining: 8.39s
550:	learn: 0.1559038	total: 10.3s	remaining: 8.37s
551:	learn: 0.1558695	total: 10.3s	remaining: 8.35s
552:	learn: 0.1558413	total: 10.3s	remaining: 8.35s
553:	learn: 0.1558168	total: 10.3s	remaining: 8.33s
554:	learn: 0.1558003	total: 10.4s	remaining: 8.31s
555:	learn: 0.1557838	total: 10.4s	remaining: 8.29s
556:	learn: 0.1557616	total: 10.4s	remaining: 8.27s
557:	learn: 0.1557114	total: 10.4s	remaining: 8.25s
558:	learn: 0.1556822	total: 10.4s	remaining: 8.23s
559:	learn: 0.1556499	total: 10.4s	remaining: 8.21s
560:	learn: 0.1556309	total: 10.5s	remaining: 8.18s
561:	learn: 0.1556129	total: 10.5s	remaining: 8.17s
562:	learn: 0.1556022	total: 10.5s	remaining: 8.15s
563:	learn: 0.1555932	total: 10.5s	remaining: 8.13s
564:	learn: 0.1555388	total: 10.5s	remaining: 8.11s
565:	learn: 0.1554754	total: 10.6s	remaining: 8.09s
566:	learn: 0.1554337	total: 10.6s	remaining: 8.07s
567:	learn: 0.1554082	total: 10.6s	remaining: 8.06s
568:	learn: 0.1553709	total: 10.6s	remaining: 8.04s
569:	learn: 0.1553569	total: 10.6s	remaining: 8.02s
570:	learn: 0.1553464	total: 10.6s	remaining: 7.99s
571:	learn: 0.1553214	total: 10.7s	remaining: 7.98s
572:	learn: 0.1553017	total: 10.7s	remaining: 7.96s
573:	learn: 0.1552908	total: 10.7s	remaining: 7.94s
574:	learn: 0.1552696	total: 10.7s	remaining: 7.92s
575:	learn: 0.1552560	total: 10.7s	remaining: 7.9s
576:	learn: 0.1552397	total: 10.7s	remaining: 7.88s
577:	learn: 0.1552072	total: 10.8s	remaining: 7.86s
578:	learn: 0.1551902	total: 10.8s	remaining: 7.84s
579:	learn: 0.1551617	total: 10.8s	remaining: 7.82s
580:	learn: 0.1551469	total: 10.8s	remaining: 7.8s
581:	learn: 0.1551296	total: 10.8s	remaining: 7.79s
582:	learn: 0.1551151	total: 10.9s	remaining: 7.76s
583:	learn: 0.1550978	total: 10.9s	remaining: 7.75s
584:	learn: 0.1550661	total: 10.9s	remaining: 7.73s
585:	learn: 0.1550530	total: 10.9s	remaining: 7.71s
586:	learn: 0.1550462	total: 10.9s	remaining: 7.68s
587:	learn: 0.1550170	total: 10.9s	remaining: 7.67s
588:	learn: 0.1550002	total: 11s	remaining: 7.65s
589:	learn: 0.1549684	total: 11s	remaining: 7.63s
590:	learn: 0.1549589	total: 11s	remaining: 7.61s
591:	learn: 0.1549419	total: 11s	remaining: 7.59s
592:	learn: 0.1549235	total: 11s	remaining: 7.57s
593:	learn: 0.1549076	total: 11.1s	remaining: 7.56s
594:	learn: 0.1548820	total: 11.1s	remaining: 7.54s
595:	learn: 0.1548688	total: 11.1s	remaining: 7.52s
596:	learn: 0.1548361	total: 11.1s	remaining: 7.5s
597:	learn: 0.1548126	total: 11.1s	remaining: 7.48s
598:	learn: 0.1547901	total: 11.1s	remaining: 7.46s
599:	learn: 0.1547610	total: 11.2s	remaining: 7.44s
600:	learn: 0.1547454	total: 11.2s	remaining: 7.42s
601:	learn: 0.1547361	total: 11.2s	remaining: 7.4s
602:	learn: 0.1547106	total: 11.2s	remaining: 7.38s
603:	learn: 0.1546801	total: 11.2s	remaining: 7.37s
604:	learn: 0.1546684	total: 11.2s	remaining: 7.34s
605:	learn: 0.1546470	total: 11.3s	remaining: 7.33s
606:	learn: 0.1546368	total: 11.3s	remaining: 7.31s
607:	learn: 0.1546103	total: 11.3s	remaining: 7.29s
608:	learn: 0.1545976	total: 11.3s	remaining: 7.27s
609:	learn: 0.1545734	total: 11.3s	remaining: 7.25s
610:	learn: 0.1545387	total: 11.4s	remaining: 7.23s
611:	learn: 0.1545119	total: 11.4s	remaining: 7.21s
612:	learn: 0.1544951	total: 11.4s	remaining: 7.18s
613:	learn: 0.1544739	total: 11.4s	remaining: 7.17s
614:	learn: 0.1544523	total: 11.4s	remaining: 7.15s
615:	learn: 0.1544370	total: 11.4s	remaining: 7.13s
616:	learn: 0.1544286	total: 11.5s	remaining: 7.11s
617:	learn: 0.1544086	total: 11.5s	remaining: 7.09s
618:	learn: 0.1543867	total: 11.5s	remaining: 7.07s
619:	learn: 0.1543688	total: 11.5s	remaining: 7.05s
620:	learn: 0.1543618	total: 11.5s	remaining: 7.03s
621:	learn: 0.1543419	total: 11.5s	remaining: 7.01s
622:	learn: 0.1543212	total: 11.6s	remaining: 6.99s
623:	learn: 0.1542919	total: 11.6s	remaining: 6.97s
624:	learn: 0.1542637	total: 11.6s	remaining: 6.95s
625:	learn: 0.1542216	total: 11.6s	remaining: 6.94s
626:	learn: 0.1542126	total: 11.6s	remaining: 6.92s
627:	learn: 0.1541948	total: 11.6s	remaining: 6.9s
628:	learn: 0.1541746	total: 11.7s	remaining: 6.88s
629:	learn: 0.1541649	total: 11.7s	remaining: 6.86s
630:	learn: 0.1541362	total: 11.7s	remaining: 6.84s
631:	learn: 0.1541119	total: 11.7s	remaining: 6.82s
632:	learn: 0.1540999	total: 11.7s	remaining: 6.8s
633:	learn: 0.1540814	total: 11.7s	remaining: 6.78s
634:	learn: 0.1540688	total: 11.8s	remaining: 6.76s
635:	learn: 0.1540461	total: 11.8s	remaining: 6.75s
636:	learn: 0.1540230	total: 11.8s	remaining: 6.72s
637:	learn: 0.1540084	total: 11.8s	remaining: 6.71s
638:	learn: 0.1539905	total: 11.8s	remaining: 6.69s
639:	learn: 0.1539815	total: 11.9s	remaining: 6.67s
640:	learn: 0.1539650	total: 11.9s	remaining: 6.65s
641:	learn: 0.1539517	total: 11.9s	remaining: 6.63s
642:	learn: 0.1539399	total: 11.9s	remaining: 6.61s
643:	learn: 0.1539102	total: 11.9s	remaining: 6.59s
644:	learn: 0.1538846	total: 11.9s	remaining: 6.57s
645:	learn: 0.1538503	total: 12s	remaining: 6.56s
646:	learn: 0.1538352	total: 12s	remaining: 6.54s
647:	learn: 0.1538241	total: 12s	remaining: 6.52s
648:	learn: 0.1538049	total: 12s	remaining: 6.5s
649:	learn: 0.1537868	total: 12s	remaining: 6.48s
650:	learn: 0.1537732	total: 12.1s	remaining: 6.46s
651:	learn: 0.1537363	total: 12.1s	remaining: 6.44s
652:	learn: 0.1537183	total: 12.1s	remaining: 6.42s
653:	learn: 0.1537042	total: 12.1s	remaining: 6.4s
654:	learn: 0.1536943	total: 12.1s	remaining: 6.38s
655:	learn: 0.1536827	total: 12.1s	remaining: 6.37s
656:	learn: 0.1536668	total: 12.2s	remaining: 6.34s
657:	learn: 0.1536437	total: 12.2s	remaining: 6.33s
658:	learn: 0.1536296	total: 12.2s	remaining: 6.3s
659:	learn: 0.1536239	total: 12.2s	remaining: 6.29s
660:	learn: 0.1536048	total: 12.2s	remaining: 6.27s
661:	learn: 0.1535828	total: 12.2s	remaining: 6.25s
662:	learn: 0.1535463	total: 12.3s	remaining: 6.23s
663:	learn: 0.1535092	total: 12.3s	remaining: 6.21s
664:	learn: 0.1534893	total: 12.3s	remaining: 6.19s
665:	learn: 0.1534628	total: 12.3s	remaining: 6.17s
666:	learn: 0.1534271	total: 12.3s	remaining: 6.15s
667:	learn: 0.1534045	total: 12.3s	remaining: 6.13s
668:	learn: 0.1533887	total: 12.4s	remaining: 6.11s
669:	learn: 0.1533553	total: 12.4s	remaining: 6.09s
670:	learn: 0.1533341	total: 12.4s	remaining: 6.08s
671:	learn: 0.1533226	total: 12.4s	remaining: 6.06s
672:	learn: 0.1533052	total: 12.4s	remaining: 6.04s
673:	learn: 0.1532969	total: 12.5s	remaining: 6.02s
674:	learn: 0.1532787	total: 12.5s	remaining: 6s
675:	learn: 0.1532618	total: 12.5s	remaining: 5.99s
676:	learn: 0.1532492	total: 12.5s	remaining: 5.97s
677:	learn: 0.1532229	total: 12.5s	remaining: 5.95s
678:	learn: 0.1532124	total: 12.6s	remaining: 5.93s
679:	learn: 0.1532010	total: 12.6s	remaining: 5.91s
680:	learn: 0.1531886	total: 12.6s	remaining: 5.89s
681:	learn: 0.1531682	total: 12.6s	remaining: 5.87s
682:	learn: 0.1531492	total: 12.6s	remaining: 5.85s
683:	learn: 0.1531228	total: 12.6s	remaining: 5.83s
684:	learn: 0.1531004	total: 12.6s	remaining: 5.81s
685:	learn: 0.1530848	total: 12.7s	remaining: 5.79s
686:	learn: 0.1530678	total: 12.7s	remaining: 5.78s
687:	learn: 0.1530442	total: 12.7s	remaining: 5.76s
688:	learn: 0.1530224	total: 12.7s	remaining: 5.74s
689:	learn: 0.1530071	total: 12.7s	remaining: 5.72s
690:	learn: 0.1529899	total: 12.7s	remaining: 5.7s
691:	learn: 0.1529699	total: 12.8s	remaining: 5.68s
692:	learn: 0.1529542	total: 12.8s	remaining: 5.66s
693:	learn: 0.1529315	total: 12.8s	remaining: 5.64s
694:	learn: 0.1529151	total: 12.8s	remaining: 5.62s
695:	learn: 0.1528929	total: 12.8s	remaining: 5.6s
696:	learn: 0.1528716	total: 12.8s	remaining: 5.58s
697:	learn: 0.1528596	total: 12.9s	remaining: 5.57s
698:	learn: 0.1528381	total: 12.9s	remaining: 5.55s
699:	learn: 0.1528207	total: 12.9s	remaining: 5.53s
700:	learn: 0.1528003	total: 12.9s	remaining: 5.51s
701:	learn: 0.1527619	total: 12.9s	remaining: 5.49s
702:	learn: 0.1527469	total: 12.9s	remaining: 5.47s
703:	learn: 0.1527137	total: 13s	remaining: 5.45s
704:	learn: 0.1526971	total: 13s	remaining: 5.43s
705:	learn: 0.1526804	total: 13s	remaining: 5.41s
706:	learn: 0.1526634	total: 13s	remaining: 5.39s
707:	learn: 0.1526588	total: 13s	remaining: 5.37s
708:	learn: 0.1526244	total: 13s	remaining: 5.35s
709:	learn: 0.1526052	total: 13.1s	remaining: 5.34s
710:	learn: 0.1525908	total: 13.1s	remaining: 5.32s
711:	learn: 0.1525647	total: 13.1s	remaining: 5.3s
712:	learn: 0.1525460	total: 13.1s	remaining: 5.28s
713:	learn: 0.1525342	total: 13.1s	remaining: 5.26s
714:	learn: 0.1524789	total: 13.1s	remaining: 5.24s
715:	learn: 0.1524569	total: 13.2s	remaining: 5.22s
716:	learn: 0.1524368	total: 13.2s	remaining: 5.2s
717:	learn: 0.1524231	total: 13.2s	remaining: 5.18s
718:	learn: 0.1523983	total: 13.2s	remaining: 5.16s
719:	learn: 0.1523885	total: 13.2s	remaining: 5.14s
720:	learn: 0.1523728	total: 13.2s	remaining: 5.12s
721:	learn: 0.1523660	total: 13.3s	remaining: 5.1s
722:	learn: 0.1523537	total: 13.3s	remaining: 5.08s
723:	learn: 0.1523340	total: 13.3s	remaining: 5.06s
724:	learn: 0.1523133	total: 13.3s	remaining: 5.04s
725:	learn: 0.1522868	total: 13.3s	remaining: 5.02s
726:	learn: 0.1522745	total: 13.3s	remaining: 5s
727:	learn: 0.1522611	total: 13.3s	remaining: 4.98s
728:	learn: 0.1522472	total: 13.4s	remaining: 4.96s
729:	learn: 0.1522348	total: 13.4s	remaining: 4.95s
730:	learn: 0.1522242	total: 13.4s	remaining: 4.93s
731:	learn: 0.1522094	total: 13.4s	remaining: 4.91s
732:	learn: 0.1521929	total: 13.4s	remaining: 4.89s
733:	learn: 0.1521672	total: 13.4s	remaining: 4.87s
734:	learn: 0.1521545	total: 13.5s	remaining: 4.85s
735:	learn: 0.1521378	total: 13.5s	remaining: 4.83s
736:	learn: 0.1521095	total: 13.5s	remaining: 4.82s
737:	learn: 0.1520934	total: 13.5s	remaining: 4.8s
738:	learn: 0.1520686	total: 13.5s	remaining: 4.78s
739:	learn: 0.1520594	total: 13.5s	remaining: 4.76s
740:	learn: 0.1520472	total: 13.6s	remaining: 4.74s
741:	learn: 0.1520340	total: 13.6s	remaining: 4.72s
742:	learn: 0.1520095	total: 13.6s	remaining: 4.7s
743:	learn: 0.1519951	total: 13.6s	remaining: 4.68s
744:	learn: 0.1519769	total: 13.6s	remaining: 4.66s
745:	learn: 0.1519643	total: 13.6s	remaining: 4.64s
746:	learn: 0.1519542	total: 13.7s	remaining: 4.63s
747:	learn: 0.1519435	total: 13.7s	remaining: 4.61s
748:	learn: 0.1519293	total: 13.7s	remaining: 4.59s
749:	learn: 0.1518922	total: 13.7s	remaining: 4.57s
750:	learn: 0.1518659	total: 13.7s	remaining: 4.55s
751:	learn: 0.1518453	total: 13.7s	remaining: 4.53s
752:	learn: 0.1518219	total: 13.8s	remaining: 4.51s
753:	learn: 0.1517951	total: 13.8s	remaining: 4.49s
754:	learn: 0.1517534	total: 13.8s	remaining: 4.47s
755:	learn: 0.1517323	total: 13.8s	remaining: 4.46s
756:	learn: 0.1517118	total: 13.8s	remaining: 4.44s
757:	learn: 0.1516986	total: 13.8s	remaining: 4.42s
758:	learn: 0.1516769	total: 13.9s	remaining: 4.4s
759:	learn: 0.1516473	total: 13.9s	remaining: 4.38s
760:	learn: 0.1516361	total: 13.9s	remaining: 4.36s
761:	learn: 0.1516215	total: 13.9s	remaining: 4.34s
762:	learn: 0.1516134	total: 13.9s	remaining: 4.33s
763:	learn: 0.1516067	total: 13.9s	remaining: 4.31s
764:	learn: 0.1515974	total: 14s	remaining: 4.29s
765:	learn: 0.1515668	total: 14s	remaining: 4.27s
766:	learn: 0.1515596	total: 14s	remaining: 4.25s
767:	learn: 0.1515402	total: 14s	remaining: 4.24s
768:	learn: 0.1515288	total: 14s	remaining: 4.22s
769:	learn: 0.1515182	total: 14s	remaining: 4.2s
770:	learn: 0.1515082	total: 14.1s	remaining: 4.18s
771:	learn: 0.1514954	total: 14.1s	remaining: 4.16s
772:	learn: 0.1514690	total: 14.1s	remaining: 4.14s
773:	learn: 0.1514391	total: 14.1s	remaining: 4.12s
774:	learn: 0.1514275	total: 14.1s	remaining: 4.1s
775:	learn: 0.1513897	total: 14.1s	remaining: 4.08s
776:	learn: 0.1513673	total: 14.2s	remaining: 4.06s
777:	learn: 0.1513454	total: 14.2s	remaining: 4.04s
778:	learn: 0.1513268	total: 14.2s	remaining: 4.03s
779:	learn: 0.1513098	total: 14.2s	remaining: 4.01s
780:	learn: 0.1512866	total: 14.2s	remaining: 3.99s
781:	learn: 0.1512686	total: 14.2s	remaining: 3.97s
782:	learn: 0.1512378	total: 14.2s	remaining: 3.95s
783:	learn: 0.1512093	total: 14.3s	remaining: 3.93s
784:	learn: 0.1511856	total: 14.3s	remaining: 3.91s
785:	learn: 0.1511767	total: 14.3s	remaining: 3.89s
786:	learn: 0.1511553	total: 14.3s	remaining: 3.87s
787:	learn: 0.1511221	total: 14.3s	remaining: 3.86s
788:	learn: 0.1511044	total: 14.3s	remaining: 3.84s
789:	learn: 0.1510971	total: 14.4s	remaining: 3.82s
790:	learn: 0.1510653	total: 14.4s	remaining: 3.8s
791:	learn: 0.1510506	total: 14.4s	remaining: 3.78s
792:	learn: 0.1510398	total: 14.4s	remaining: 3.76s
793:	learn: 0.1510231	total: 14.4s	remaining: 3.74s
794:	learn: 0.1510067	total: 14.4s	remaining: 3.73s
795:	learn: 0.1509769	total: 14.5s	remaining: 3.71s
796:	learn: 0.1509483	total: 14.5s	remaining: 3.69s
797:	learn: 0.1509244	total: 14.5s	remaining: 3.67s
798:	learn: 0.1509098	total: 14.5s	remaining: 3.65s
799:	learn: 0.1508912	total: 14.5s	remaining: 3.63s
800:	learn: 0.1508631	total: 14.5s	remaining: 3.61s
801:	learn: 0.1508543	total: 14.6s	remaining: 3.59s
802:	learn: 0.1508460	total: 14.6s	remaining: 3.58s
803:	learn: 0.1508402	total: 14.6s	remaining: 3.56s
804:	learn: 0.1508266	total: 14.6s	remaining: 3.54s
805:	learn: 0.1508167	total: 14.6s	remaining: 3.52s
806:	learn: 0.1508063	total: 14.6s	remaining: 3.5s
807:	learn: 0.1507899	total: 14.7s	remaining: 3.48s
808:	learn: 0.1507800	total: 14.7s	remaining: 3.46s
809:	learn: 0.1507572	total: 14.7s	remaining: 3.45s
810:	learn: 0.1507416	total: 14.7s	remaining: 3.43s
811:	learn: 0.1507312	total: 14.7s	remaining: 3.41s
812:	learn: 0.1507116	total: 14.8s	remaining: 3.39s
813:	learn: 0.1506879	total: 14.8s	remaining: 3.38s
814:	learn: 0.1506811	total: 14.8s	remaining: 3.36s
815:	learn: 0.1506685	total: 14.8s	remaining: 3.34s
816:	learn: 0.1506578	total: 14.8s	remaining: 3.32s
817:	learn: 0.1506428	total: 14.8s	remaining: 3.3s
818:	learn: 0.1506345	total: 14.9s	remaining: 3.28s
819:	learn: 0.1506231	total: 14.9s	remaining: 3.26s
820:	learn: 0.1506135	total: 14.9s	remaining: 3.25s
821:	learn: 0.1505965	total: 14.9s	remaining: 3.23s
822:	learn: 0.1505805	total: 14.9s	remaining: 3.21s
823:	learn: 0.1505495	total: 14.9s	remaining: 3.19s
824:	learn: 0.1505418	total: 15s	remaining: 3.17s
825:	learn: 0.1505203	total: 15s	remaining: 3.15s
826:	learn: 0.1505080	total: 15s	remaining: 3.14s
827:	learn: 0.1504862	total: 15s	remaining: 3.12s
828:	learn: 0.1504691	total: 15s	remaining: 3.1s
829:	learn: 0.1504492	total: 15s	remaining: 3.08s
830:	learn: 0.1504250	total: 15.1s	remaining: 3.06s
831:	learn: 0.1504134	total: 15.1s	remaining: 3.04s
832:	learn: 0.1504061	total: 15.1s	remaining: 3.03s
833:	learn: 0.1503796	total: 15.1s	remaining: 3.01s
834:	learn: 0.1503577	total: 15.1s	remaining: 2.99s
835:	learn: 0.1503325	total: 15.2s	remaining: 2.97s
836:	learn: 0.1503190	total: 15.2s	remaining: 2.95s
837:	learn: 0.1503136	total: 15.2s	remaining: 2.94s
838:	learn: 0.1502883	total: 15.2s	remaining: 2.92s
839:	learn: 0.1502803	total: 15.2s	remaining: 2.9s
840:	learn: 0.1502587	total: 15.2s	remaining: 2.88s
841:	learn: 0.1502482	total: 15.3s	remaining: 2.86s
842:	learn: 0.1502397	total: 15.3s	remaining: 2.84s
843:	learn: 0.1502171	total: 15.3s	remaining: 2.83s
844:	learn: 0.1501965	total: 15.3s	remaining: 2.81s
845:	learn: 0.1501732	total: 15.3s	remaining: 2.79s
846:	learn: 0.1501640	total: 15.3s	remaining: 2.77s
847:	learn: 0.1501336	total: 15.3s	remaining: 2.75s
848:	learn: 0.1500921	total: 15.4s	remaining: 2.73s
849:	learn: 0.1500724	total: 15.4s	remaining: 2.71s
850:	learn: 0.1500533	total: 15.4s	remaining: 2.69s
851:	learn: 0.1500395	total: 15.4s	remaining: 2.68s
852:	learn: 0.1500176	total: 15.4s	remaining: 2.66s
853:	learn: 0.1500037	total: 15.4s	remaining: 2.64s
854:	learn: 0.1499795	total: 15.5s	remaining: 2.62s
855:	learn: 0.1499557	total: 15.5s	remaining: 2.6s
856:	learn: 0.1499345	total: 15.5s	remaining: 2.58s
857:	learn: 0.1499250	total: 15.5s	remaining: 2.57s
858:	learn: 0.1499004	total: 15.5s	remaining: 2.55s
859:	learn: 0.1498821	total: 15.5s	remaining: 2.53s
860:	learn: 0.1498595	total: 15.6s	remaining: 2.51s
861:	learn: 0.1498492	total: 15.6s	remaining: 2.49s
862:	learn: 0.1498295	total: 15.6s	remaining: 2.47s
863:	learn: 0.1498082	total: 15.6s	remaining: 2.46s
864:	learn: 0.1497837	total: 15.6s	remaining: 2.44s
865:	learn: 0.1497614	total: 15.7s	remaining: 2.42s
866:	learn: 0.1497483	total: 15.7s	remaining: 2.4s
867:	learn: 0.1497306	total: 15.7s	remaining: 2.38s
868:	learn: 0.1497127	total: 15.7s	remaining: 2.37s
869:	learn: 0.1496935	total: 15.7s	remaining: 2.35s
870:	learn: 0.1496631	total: 15.7s	remaining: 2.33s
871:	learn: 0.1496541	total: 15.8s	remaining: 2.31s
872:	learn: 0.1496305	total: 15.8s	remaining: 2.29s
873:	learn: 0.1496175	total: 15.8s	remaining: 2.28s
874:	learn: 0.1496028	total: 15.8s	remaining: 2.26s
875:	learn: 0.1495945	total: 15.8s	remaining: 2.24s
876:	learn: 0.1495695	total: 15.8s	remaining: 2.22s
877:	learn: 0.1495572	total: 15.8s	remaining: 2.2s
878:	learn: 0.1495403	total: 15.9s	remaining: 2.18s
879:	learn: 0.1495214	total: 15.9s	remaining: 2.17s
880:	learn: 0.1495170	total: 15.9s	remaining: 2.15s
881:	learn: 0.1494879	total: 15.9s	remaining: 2.13s
882:	learn: 0.1494739	total: 15.9s	remaining: 2.11s
883:	learn: 0.1494590	total: 15.9s	remaining: 2.09s
884:	learn: 0.1494137	total: 16s	remaining: 2.07s
885:	learn: 0.1494019	total: 16s	remaining: 2.06s
886:	learn: 0.1493886	total: 16s	remaining: 2.04s
887:	learn: 0.1493811	total: 16s	remaining: 2.02s
888:	learn: 0.1493629	total: 16s	remaining: 2s
889:	learn: 0.1493437	total: 16s	remaining: 1.98s
890:	learn: 0.1493371	total: 16s	remaining: 1.96s
891:	learn: 0.1493104	total: 16.1s	remaining: 1.95s
892:	learn: 0.1493012	total: 16.1s	remaining: 1.93s
893:	learn: 0.1492920	total: 16.1s	remaining: 1.91s
894:	learn: 0.1492823	total: 16.1s	remaining: 1.89s
895:	learn: 0.1492723	total: 16.1s	remaining: 1.87s
896:	learn: 0.1492507	total: 16.2s	remaining: 1.85s
897:	learn: 0.1492439	total: 16.2s	remaining: 1.84s
898:	learn: 0.1492284	total: 16.2s	remaining: 1.82s
899:	learn: 0.1492144	total: 16.2s	remaining: 1.8s
900:	learn: 0.1492095	total: 16.2s	remaining: 1.78s
901:	learn: 0.1491877	total: 16.2s	remaining: 1.76s
902:	learn: 0.1491794	total: 16.3s	remaining: 1.75s
903:	learn: 0.1491578	total: 16.3s	remaining: 1.73s
904:	learn: 0.1491477	total: 16.3s	remaining: 1.71s
905:	learn: 0.1491370	total: 16.3s	remaining: 1.69s
906:	learn: 0.1491329	total: 16.3s	remaining: 1.67s
907:	learn: 0.1491238	total: 16.3s	remaining: 1.66s
908:	learn: 0.1491111	total: 16.4s	remaining: 1.64s
909:	learn: 0.1491013	total: 16.4s	remaining: 1.62s
910:	learn: 0.1490850	total: 16.4s	remaining: 1.6s
911:	learn: 0.1490730	total: 16.4s	remaining: 1.58s
912:	learn: 0.1490664	total: 16.4s	remaining: 1.56s
913:	learn: 0.1490211	total: 16.4s	remaining: 1.55s
914:	learn: 0.1490016	total: 16.5s	remaining: 1.53s
915:	learn: 0.1489886	total: 16.5s	remaining: 1.51s
916:	learn: 0.1489774	total: 16.5s	remaining: 1.49s
917:	learn: 0.1489683	total: 16.5s	remaining: 1.47s
918:	learn: 0.1489544	total: 16.5s	remaining: 1.46s
919:	learn: 0.1489388	total: 16.5s	remaining: 1.44s
920:	learn: 0.1489269	total: 16.6s	remaining: 1.42s
921:	learn: 0.1489052	total: 16.6s	remaining: 1.4s
922:	learn: 0.1488855	total: 16.6s	remaining: 1.38s
923:	learn: 0.1488576	total: 16.6s	remaining: 1.36s
924:	learn: 0.1488470	total: 16.6s	remaining: 1.35s
925:	learn: 0.1488379	total: 16.6s	remaining: 1.33s
926:	learn: 0.1488243	total: 16.7s	remaining: 1.31s
927:	learn: 0.1488101	total: 16.7s	remaining: 1.29s
928:	learn: 0.1488014	total: 16.7s	remaining: 1.27s
929:	learn: 0.1487871	total: 16.7s	remaining: 1.26s
930:	learn: 0.1487798	total: 16.7s	remaining: 1.24s
931:	learn: 0.1487668	total: 16.7s	remaining: 1.22s
932:	learn: 0.1487615	total: 16.8s	remaining: 1.2s
933:	learn: 0.1487467	total: 16.8s	remaining: 1.18s
934:	learn: 0.1487278	total: 16.8s	remaining: 1.17s
935:	learn: 0.1487223	total: 16.8s	remaining: 1.15s
936:	learn: 0.1487036	total: 16.8s	remaining: 1.13s
937:	learn: 0.1486979	total: 16.8s	remaining: 1.11s
938:	learn: 0.1486831	total: 16.9s	remaining: 1.09s
939:	learn: 0.1486697	total: 16.9s	remaining: 1.08s
940:	learn: 0.1486419	total: 16.9s	remaining: 1.06s
941:	learn: 0.1486283	total: 16.9s	remaining: 1.04s
942:	learn: 0.1486131	total: 16.9s	remaining: 1.02s
943:	learn: 0.1485854	total: 16.9s	remaining: 1s
944:	learn: 0.1485717	total: 16.9s	remaining: 986ms
945:	learn: 0.1485586	total: 17s	remaining: 968ms
946:	learn: 0.1485364	total: 17s	remaining: 950ms
947:	learn: 0.1485208	total: 17s	remaining: 932ms
948:	learn: 0.1485096	total: 17s	remaining: 914ms
949:	learn: 0.1485021	total: 17s	remaining: 896ms
950:	learn: 0.1484884	total: 17s	remaining: 878ms
951:	learn: 0.1484602	total: 17.1s	remaining: 860ms
952:	learn: 0.1484366	total: 17.1s	remaining: 842ms
953:	learn: 0.1484289	total: 17.1s	remaining: 824ms
954:	learn: 0.1484217	total: 17.1s	remaining: 806ms
955:	learn: 0.1484086	total: 17.1s	remaining: 788ms
956:	learn: 0.1483961	total: 17.1s	remaining: 770ms
957:	learn: 0.1483810	total: 17.2s	remaining: 752ms
958:	learn: 0.1483611	total: 17.2s	remaining: 734ms
959:	learn: 0.1483402	total: 17.2s	remaining: 716ms
960:	learn: 0.1483107	total: 17.2s	remaining: 698ms
961:	learn: 0.1482925	total: 17.2s	remaining: 680ms
962:	learn: 0.1482621	total: 17.2s	remaining: 662ms
963:	learn: 0.1482419	total: 17.3s	remaining: 644ms
964:	learn: 0.1482235	total: 17.3s	remaining: 626ms
965:	learn: 0.1482151	total: 17.3s	remaining: 608ms
966:	learn: 0.1482077	total: 17.3s	remaining: 590ms
967:	learn: 0.1481980	total: 17.3s	remaining: 572ms
968:	learn: 0.1481590	total: 17.3s	remaining: 554ms
969:	learn: 0.1481387	total: 17.3s	remaining: 537ms
970:	learn: 0.1481345	total: 17.4s	remaining: 519ms
971:	learn: 0.1481229	total: 17.4s	remaining: 501ms
972:	learn: 0.1481028	total: 17.4s	remaining: 483ms
973:	learn: 0.1480933	total: 17.4s	remaining: 465ms
974:	learn: 0.1480780	total: 17.4s	remaining: 447ms
975:	learn: 0.1480519	total: 17.4s	remaining: 429ms
976:	learn: 0.1480362	total: 17.5s	remaining: 411ms
977:	learn: 0.1480306	total: 17.5s	remaining: 393ms
978:	learn: 0.1480206	total: 17.5s	remaining: 375ms
979:	learn: 0.1480137	total: 17.5s	remaining: 357ms
980:	learn: 0.1480053	total: 17.5s	remaining: 339ms
981:	learn: 0.1479807	total: 17.5s	remaining: 321ms
982:	learn: 0.1479527	total: 17.6s	remaining: 304ms
983:	learn: 0.1479460	total: 17.6s	remaining: 286ms
984:	learn: 0.1479380	total: 17.6s	remaining: 268ms
985:	learn: 0.1479214	total: 17.6s	remaining: 250ms
986:	learn: 0.1479126	total: 17.6s	remaining: 232ms
987:	learn: 0.1478943	total: 17.6s	remaining: 214ms
988:	learn: 0.1478803	total: 17.6s	remaining: 196ms
989:	learn: 0.1478711	total: 17.7s	remaining: 178ms
990:	learn: 0.1478326	total: 17.7s	remaining: 161ms
991:	learn: 0.1478291	total: 17.7s	remaining: 143ms
992:	learn: 0.1478235	total: 17.7s	remaining: 125ms
993:	learn: 0.1478091	total: 17.7s	remaining: 107ms
994:	learn: 0.1477921	total: 17.7s	remaining: 89.2ms
995:	learn: 0.1477849	total: 17.8s	remaining: 71.3ms
996:	learn: 0.1477669	total: 17.8s	remaining: 53.5ms
997:	learn: 0.1477516	total: 17.8s	remaining: 35.7ms
998:	learn: 0.1477113	total: 17.8s	remaining: 17.8ms
999:	learn: 0.1476888	total: 17.8s	remaining: 0us
In [85]:
print('Train F1-Score')
print(f1_cb_tr*100)
print('Validation F1-Score')
print(f1_cb_val*100)
Train F1-Score
54.79268836379848
Validation F1-Score
51.70682730923696

Light GBM

In [64]:
LGBM=LGBMClassifier()
LGBM.fit(X_train,y_train)
y_pred_LGBM_val=LGBM.predict(X_val)
y_pred_LGBM_tr=LGBM.predict(X_train)

f1_lgbm_val=f1_score(y_val,y_pred_LGBM_val)
f1_lgbm_tr=f1_score(y_train,y_pred_LGBM_tr)
In [65]:
print('Train F1-Score')
print(f1_lgbm_tr*100)
print('Validation F1-Score')
print(f1_lgbm_val*100)
Train F1-Score
52.98340062808433
Validation F1-Score
50.25432349949135

Stacking

In [97]:
xgb=XGBClassifier(learning_rate=0.1,n_estimators=150,max_depth=5,min_child_weight=5,gamma=0.3,nthread=8,subsample=0.8,
                  colsample_bytree=0.8,objective='binary:logistic',scale_pos_weight=3,seed=12)
gb=GradientBoostingClassifier(n_estimators=300,max_features=0.9,learning_rate=0.25,max_depth=4,
                              min_samples_leaf=2,subsample=1,verbose=0,random_state=12)
lr=LogisticRegression()

blended_classifier=StackingClassifier(classifiers=[xgb,gb],
                                      meta_classifier=lr)

blended_classifier.fit(X_train,y_train)
y_pred_BC_val=blended_classifier.predict(X_val)
y_pred_BC_tr=blended_classifier.predict(X_train)

f1_bcs_tuned_val=f1_score(y_val,y_pred_BC_val)
f1_bcs_tuned_tr=f1_score(y_train,y_pred_BC_tr)
In [98]:
print('Train F1-Score')
print(f1_bcs_tuned_tr*100)
print('Validation F1-Score')
print(f1_bcs_tuned_val*100)
Train F1-Score
58.61566484517304
Validation F1-Score
50.545759865659114

Ensemble Voting Classifier

In [101]:
xgb=XGBClassifier(learning_rate=0.1,n_estimators=150,max_depth=5,min_child_weight=5,gamma=0.3,nthread=8,subsample=0.8,
                  colsample_bytree=0.8,objective='binary:logistic',scale_pos_weight=3,seed=12)
gb=GradientBoostingClassifier(n_estimators=300,max_features=0.9,learning_rate=0.25,max_depth=4,
                              min_samples_leaf=2,subsample=1,verbose=0,random_state=12)


evc=EnsembleVoteClassifier(clfs=[xgb,gb],voting='soft')

evc.fit(X_train,y_train)
y_pred_evc_val=evc.predict(X_val)
y_pred_evc_tr=evc.predict(X_train)

f1_evc_tuned_val=f1_score(y_val,y_pred_evc_val)
f1_evc_tuned_tr=f1_score(y_train,y_pred_evc_tr)
In [102]:
print('Train F1-Score')
print(f1_evc_tuned_tr*100)
print('Validation F1-Score')
print(f1_evc_tuned_val*100)
Train F1-Score
58.90241750485266
Validation F1-Score
51.85185185185186

ANN

In [60]:
import keras 
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Dropout
Using TensorFlow backend.

Defining Errro Metric

In [61]:
from keras import backend as K
def recall_m(y_true,y_pred):
        true_positives=K.sum(K.round(K.clip(y_true * y_pred,0,1)))
        possible_positives=K.sum(K.round(K.clip(y_true,0,1)))
        recall=true_positives / (possible_positives + K.epsilon())
        return recall
def precision_m(y_true,y_pred):
        true_positives=K.sum(K.round(K.clip(y_true * y_pred, 0,1)))
        predicted_positives=K.sum(K.round(K.clip(y_pred,0,1)))
        precision=true_positives / (predicted_positives + K.epsilon())
        return precision
def f1_m(y_true,y_pred):
    precision=precision_m(y_true,y_pred)
    recall=recall_m(y_true,y_pred)
    return 2*((precision*recall)/(precision+recall+K.epsilon()))

Preprocessing

In [62]:
# For encoding the Target
ohe=OneHotEncoder(handle_unknown='ignore')

# Fit on Train
OHE=ohe.fit(y_train.values.reshape(-1,1))

# Transform on Validation and Train
OHE_target_train_ann=OHE.transform(y_train.values.reshape(-1,1)).toarray()
OHE_target_val_ann=OHE.transform(y_val.values.reshape(-1,1)).toarray()

# Check the shape
OHE_target_val_ann.shape
Out[62]:
(18087, 2)
In [81]:
classifier_ANN=Sequential()

classifier_ANN.add(Dense(units=10,kernel_initializer='uniform',activation='relu',input_dim=19))

classifier_ANN.add(Dense(units=5,kernel_initializer='uniform',activation='relu'))

classifier_ANN.add(Dense(units=2,kernel_initializer='uniform',activation='softmax'))

classifier_ANN.compile(optimizer='adamax',loss='categorical_crossentropy',metrics=[f1_m])

classifier_ANN.summary()
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_5 (Dense)              (None, 10)                200       
_________________________________________________________________
dense_6 (Dense)              (None, 5)                 55        
_________________________________________________________________
dense_7 (Dense)              (None, 2)                 12        
=================================================================
Total params: 267
Trainable params: 267
Non-trainable params: 0
_________________________________________________________________
In [82]:
history=classifier_ANN.fit(X_train,OHE_target_train_ann,validation_data=(X_val,OHE_target_val_ann),batch_size=20,epochs=100)
Train on 36721 samples, validate on 18087 samples
Epoch 1/100
36721/36721 [==============================] - 3s 87us/step - loss: 0.3196 - f1_m: 0.9122 - val_loss: 0.2828 - val_f1_m: 0.9203
Epoch 2/100
36721/36721 [==============================] - 3s 93us/step - loss: 0.2840 - f1_m: 0.9122 - val_loss: 0.2554 - val_f1_m: 0.9203
Epoch 3/100
36721/36721 [==============================] - 5s 125us/step - loss: 0.2615 - f1_m: 0.9117 - val_loss: 0.2433 - val_f1_m: 0.9203
Epoch 4/100
36721/36721 [==============================] - 4s 96us/step - loss: 0.2540 - f1_m: 0.9122 - val_loss: 0.2396 - val_f1_m: 0.9203
Epoch 5/100
36721/36721 [==============================] - 3s 88us/step - loss: 0.2499 - f1_m: 0.9122 - val_loss: 0.2333 - val_f1_m: 0.9203
Epoch 6/100
36721/36721 [==============================] - 5s 130us/step - loss: 0.2461 - f1_m: 0.9122 - val_loss: 0.2299 - val_f1_m: 0.9203
Epoch 7/100
36721/36721 [==============================] - 5s 142us/step - loss: 0.2427 - f1_m: 0.9122 - val_loss: 0.2252 - val_f1_m: 0.9203
Epoch 8/100
36721/36721 [==============================] - 3s 94us/step - loss: 0.2390 - f1_m: 0.9122 - val_loss: 0.2221 - val_f1_m: 0.9203
Epoch 9/100
36721/36721 [==============================] - 4s 96us/step - loss: 0.2354 - f1_m: 0.9122 - val_loss: 0.2216 - val_f1_m: 0.9203
Epoch 10/100
36721/36721 [==============================] - 4s 96us/step - loss: 0.2320 - f1_m: 0.9122 - val_loss: 0.2169 - val_f1_m: 0.9203
Epoch 11/100
36721/36721 [==============================] - 4s 118us/step - loss: 0.2286 - f1_m: 0.9122 - val_loss: 0.2152 - val_f1_m: 0.9203
Epoch 12/100
36721/36721 [==============================] - 4s 98us/step - loss: 0.2261 - f1_m: 0.9122 - val_loss: 0.2144 - val_f1_m: 0.9203
Epoch 13/100
36721/36721 [==============================] - 3s 90us/step - loss: 0.2235 - f1_m: 0.9162 - val_loss: 0.2130 - val_f1_m: 0.9239
Epoch 14/100
36721/36721 [==============================] - 3s 89us/step - loss: 0.2204 - f1_m: 0.9171 - val_loss: 0.2079 - val_f1_m: 0.9257
Epoch 15/100
36721/36721 [==============================] - 3s 89us/step - loss: 0.2184 - f1_m: 0.9180 - val_loss: 0.2037 - val_f1_m: 0.9261
Epoch 16/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.2147 - f1_m: 0.9195 - val_loss: 0.1992 - val_f1_m: 0.9270
Epoch 17/100
36721/36721 [==============================] - 3s 92us/step - loss: 0.2122 - f1_m: 0.9207 - val_loss: 0.1952 - val_f1_m: 0.9290
Epoch 18/100
36721/36721 [==============================] - 3s 90us/step - loss: 0.2095 - f1_m: 0.9228 - val_loss: 0.1932 - val_f1_m: 0.9297
Epoch 19/100
36721/36721 [==============================] - 3s 85us/step - loss: 0.2077 - f1_m: 0.9240 - val_loss: 0.1923 - val_f1_m: 0.9339
Epoch 20/100
36721/36721 [==============================] - 3s 89us/step - loss: 0.2057 - f1_m: 0.9257 - val_loss: 0.1914 - val_f1_m: 0.9334
Epoch 21/100
36721/36721 [==============================] - 3s 90us/step - loss: 0.2031 - f1_m: 0.9270 - val_loss: 0.1876 - val_f1_m: 0.9333
Epoch 22/100
36721/36721 [==============================] - 4s 100us/step - loss: 0.2027 - f1_m: 0.9281 - val_loss: 0.1883 - val_f1_m: 0.9365
Epoch 23/100
36721/36721 [==============================] - 3s 87us/step - loss: 0.2011 - f1_m: 0.9287 - val_loss: 0.1872 - val_f1_m: 0.9380
Epoch 24/100
36721/36721 [==============================] - 4s 101us/step - loss: 0.2004 - f1_m: 0.9297 - val_loss: 0.1847 - val_f1_m: 0.9377
Epoch 25/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1996 - f1_m: 0.9296 - val_loss: 0.2151 - val_f1_m: 0.9310
Epoch 26/100
36721/36721 [==============================] - 3s 84us/step - loss: 0.1995 - f1_m: 0.9308 - val_loss: 0.1930 - val_f1_m: 0.9398
Epoch 27/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1987 - f1_m: 0.9309 - val_loss: 0.1842 - val_f1_m: 0.9390
Epoch 28/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1992 - f1_m: 0.9311 - val_loss: 0.1924 - val_f1_m: 0.9335
Epoch 29/100
36721/36721 [==============================] - 3s 90us/step - loss: 0.1979 - f1_m: 0.9316 - val_loss: 0.1853 - val_f1_m: 0.9399
Epoch 30/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1977 - f1_m: 0.9315 - val_loss: 0.1844 - val_f1_m: 0.9413
Epoch 31/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1955 - f1_m: 0.9320 - val_loss: 0.1847 - val_f1_m: 0.9391
Epoch 32/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1980 - f1_m: 0.9318 - val_loss: 0.1789 - val_f1_m: 0.9396
Epoch 33/100
36721/36721 [==============================] - 3s 86us/step - loss: 0.1969 - f1_m: 0.9322 - val_loss: 0.1863 - val_f1_m: 0.9345
Epoch 34/100
36721/36721 [==============================] - 3s 88us/step - loss: 0.1971 - f1_m: 0.9325 - val_loss: 0.1790 - val_f1_m: 0.9392
Epoch 35/100
36721/36721 [==============================] - 3s 88us/step - loss: 0.1977 - f1_m: 0.9318 - val_loss: 0.1799 - val_f1_m: 0.9406
Epoch 36/100
36721/36721 [==============================] - 3s 87us/step - loss: 0.1970 - f1_m: 0.9324 - val_loss: 0.1783 - val_f1_m: 0.9399
Epoch 37/100
36721/36721 [==============================] - 3s 90us/step - loss: 0.1965 - f1_m: 0.9322 - val_loss: 0.1792 - val_f1_m: 0.9404
Epoch 38/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1968 - f1_m: 0.9326 - val_loss: 0.1792 - val_f1_m: 0.9388
Epoch 39/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1961 - f1_m: 0.9336 - val_loss: 0.1788 - val_f1_m: 0.9383
Epoch 40/100
36721/36721 [==============================] - 3s 86us/step - loss: 0.1965 - f1_m: 0.9330 - val_loss: 0.1966 - val_f1_m: 0.9389
Epoch 41/100
36721/36721 [==============================] - 3s 88us/step - loss: 0.1953 - f1_m: 0.9327 - val_loss: 0.1825 - val_f1_m: 0.9376
Epoch 42/100
36721/36721 [==============================] - 3s 87us/step - loss: 0.1956 - f1_m: 0.9330 - val_loss: 0.1786 - val_f1_m: 0.9391
Epoch 43/100
36721/36721 [==============================] - 3s 86us/step - loss: 0.1953 - f1_m: 0.9335 - val_loss: 0.1826 - val_f1_m: 0.9361
Epoch 44/100
36721/36721 [==============================] - 3s 88us/step - loss: 0.1954 - f1_m: 0.9327 - val_loss: 0.1837 - val_f1_m: 0.9414
Epoch 45/100
36721/36721 [==============================] - 3s 88us/step - loss: 0.1943 - f1_m: 0.9333 - val_loss: 0.1925 - val_f1_m: 0.9379
Epoch 46/100
36721/36721 [==============================] - 3s 92us/step - loss: 0.1960 - f1_m: 0.9332 - val_loss: 0.1806 - val_f1_m: 0.9380
Epoch 47/100
36721/36721 [==============================] - 3s 89us/step - loss: 0.1949 - f1_m: 0.9336 - val_loss: 0.1898 - val_f1_m: 0.9406
Epoch 48/100
36721/36721 [==============================] - 3s 84us/step - loss: 0.1952 - f1_m: 0.9334 - val_loss: 0.1828 - val_f1_m: 0.9372
Epoch 49/100
36721/36721 [==============================] - 3s 91us/step - loss: 0.1952 - f1_m: 0.9326 - val_loss: 0.1780 - val_f1_m: 0.9419
Epoch 50/100
36721/36721 [==============================] - 3s 92us/step - loss: 0.1951 - f1_m: 0.9336 - val_loss: 0.1830 - val_f1_m: 0.9387
Epoch 51/100
36721/36721 [==============================] - 3s 93us/step - loss: 0.1945 - f1_m: 0.9337 - val_loss: 0.1809 - val_f1_m: 0.9416
Epoch 52/100
36721/36721 [==============================] - 4s 96us/step - loss: 0.1950 - f1_m: 0.9336 - val_loss: 0.1767 - val_f1_m: 0.9406
Epoch 53/100
36721/36721 [==============================] - 4s 111us/step - loss: 0.1940 - f1_m: 0.9335 - val_loss: 0.1825 - val_f1_m: 0.9418
Epoch 54/100
36721/36721 [==============================] - 4s 119us/step - loss: 0.1953 - f1_m: 0.9331 - val_loss: 0.1877 - val_f1_m: 0.9416
Epoch 55/100
36721/36721 [==============================] - 5s 127us/step - loss: 0.1936 - f1_m: 0.9342 - val_loss: 0.1800 - val_f1_m: 0.9421
Epoch 56/100
36721/36721 [==============================] - 3s 82us/step - loss: 0.1946 - f1_m: 0.9338 - val_loss: 0.1799 - val_f1_m: 0.9404
Epoch 57/100
36721/36721 [==============================] - 3s 83us/step - loss: 0.1946 - f1_m: 0.9328 - val_loss: 0.1835 - val_f1_m: 0.9409
Epoch 58/100
36721/36721 [==============================] - 3s 88us/step - loss: 0.1934 - f1_m: 0.9338 - val_loss: 0.1787 - val_f1_m: 0.9422
Epoch 59/100
36721/36721 [==============================] - 3s 81us/step - loss: 0.1932 - f1_m: 0.9343 - val_loss: 0.1860 - val_f1_m: 0.9385
Epoch 60/100
36721/36721 [==============================] - 3s 89us/step - loss: 0.1943 - f1_m: 0.9338 - val_loss: 0.1777 - val_f1_m: 0.9421
Epoch 61/100
36721/36721 [==============================] - 3s 74us/step - loss: 0.1937 - f1_m: 0.9340 - val_loss: 0.1907 - val_f1_m: 0.9371
Epoch 62/100
36721/36721 [==============================] - 3s 87us/step - loss: 0.1940 - f1_m: 0.9332 - val_loss: 0.1802 - val_f1_m: 0.9432
Epoch 63/100
36721/36721 [==============================] - 4s 104us/step - loss: 0.1938 - f1_m: 0.9339 - val_loss: 0.1782 - val_f1_m: 0.9402
Epoch 64/100
36721/36721 [==============================] - 4s 98us/step - loss: 0.1941 - f1_m: 0.9344 - val_loss: 0.1760 - val_f1_m: 0.9418
Epoch 65/100
36721/36721 [==============================] - 3s 78us/step - loss: 0.1942 - f1_m: 0.9339 - val_loss: 0.1776 - val_f1_m: 0.9418
Epoch 66/100
36721/36721 [==============================] - 3s 83us/step - loss: 0.1938 - f1_m: 0.9334 - val_loss: 0.1782 - val_f1_m: 0.9417
Epoch 67/100
36721/36721 [==============================] - 3s 87us/step - loss: 0.1925 - f1_m: 0.9345 - val_loss: 0.1901 - val_f1_m: 0.9342
Epoch 68/100
36721/36721 [==============================] - 4s 96us/step - loss: 0.1937 - f1_m: 0.9347 - val_loss: 0.1956 - val_f1_m: 0.9376
Epoch 69/100
36721/36721 [==============================] - 4s 97us/step - loss: 0.1941 - f1_m: 0.9340 - val_loss: 0.1803 - val_f1_m: 0.9419
Epoch 70/100
36721/36721 [==============================] - 3s 76us/step - loss: 0.1933 - f1_m: 0.9344 - val_loss: 0.1766 - val_f1_m: 0.9411
Epoch 71/100
36721/36721 [==============================] - 3s 78us/step - loss: 0.1934 - f1_m: 0.9338 - val_loss: 0.1777 - val_f1_m: 0.9401
Epoch 72/100
36721/36721 [==============================] - 3s 81us/step - loss: 0.1932 - f1_m: 0.9345 - val_loss: 0.1760 - val_f1_m: 0.9407
Epoch 73/100
36721/36721 [==============================] - 3s 86us/step - loss: 0.1936 - f1_m: 0.9344 - val_loss: 0.1798 - val_f1_m: 0.9423
Epoch 74/100
36721/36721 [==============================] - 4s 99us/step - loss: 0.1924 - f1_m: 0.9340 - val_loss: 0.2065 - val_f1_m: 0.9326
Epoch 75/100
36721/36721 [==============================] - 4s 103us/step - loss: 0.1936 - f1_m: 0.9344 - val_loss: 0.1766 - val_f1_m: 0.9430
Epoch 76/100
36721/36721 [==============================] - 3s 85us/step - loss: 0.1942 - f1_m: 0.9345 - val_loss: 0.1780 - val_f1_m: 0.9420
Epoch 77/100
36721/36721 [==============================] - 3s 95us/step - loss: 0.1939 - f1_m: 0.9347 - val_loss: 0.1771 - val_f1_m: 0.9401
Epoch 78/100
36721/36721 [==============================] - 3s 78us/step - loss: 0.1933 - f1_m: 0.9345 - val_loss: 0.1778 - val_f1_m: 0.9420
Epoch 79/100
36721/36721 [==============================] - 3s 85us/step - loss: 0.1933 - f1_m: 0.9345 - val_loss: 0.1772 - val_f1_m: 0.9407
Epoch 80/100
36721/36721 [==============================] - 3s 76us/step - loss: 0.1932 - f1_m: 0.9344 - val_loss: 0.1877 - val_f1_m: 0.9375
Epoch 81/100
36721/36721 [==============================] - 3s 76us/step - loss: 0.1917 - f1_m: 0.9349 - val_loss: 0.1905 - val_f1_m: 0.9382
Epoch 82/100
36721/36721 [==============================] - 3s 75us/step - loss: 0.1932 - f1_m: 0.9349 - val_loss: 0.1795 - val_f1_m: 0.9419
Epoch 83/100
36721/36721 [==============================] - 3s 75us/step - loss: 0.1920 - f1_m: 0.9349 - val_loss: 0.1753 - val_f1_m: 0.9428
Epoch 84/100
36721/36721 [==============================] - 3s 74us/step - loss: 0.1920 - f1_m: 0.9345 - val_loss: 0.1757 - val_f1_m: 0.9434
Epoch 85/100
36721/36721 [==============================] - 3s 74us/step - loss: 0.1930 - f1_m: 0.9351 - val_loss: 0.1773 - val_f1_m: 0.9401
Epoch 86/100
36721/36721 [==============================] - 3s 74us/step - loss: 0.1926 - f1_m: 0.9347 - val_loss: 0.1859 - val_f1_m: 0.9409
Epoch 87/100
36721/36721 [==============================] - 3s 78us/step - loss: 0.1927 - f1_m: 0.9345 - val_loss: 0.1776 - val_f1_m: 0.9428
Epoch 88/100
36721/36721 [==============================] - 3s 82us/step - loss: 0.1930 - f1_m: 0.9342 - val_loss: 0.1759 - val_f1_m: 0.9414
Epoch 89/100
36721/36721 [==============================] - 3s 78us/step - loss: 0.1925 - f1_m: 0.9348 - val_loss: 0.2239 - val_f1_m: 0.9136
Epoch 90/100
36721/36721 [==============================] - 3s 76us/step - loss: 0.1922 - f1_m: 0.9345 - val_loss: 0.1851 - val_f1_m: 0.9403
Epoch 91/100
36721/36721 [==============================] - 3s 76us/step - loss: 0.1923 - f1_m: 0.9343 - val_loss: 0.1750 - val_f1_m: 0.9428
Epoch 92/100
36721/36721 [==============================] - 3s 76us/step - loss: 0.1917 - f1_m: 0.9352 - val_loss: 0.1767 - val_f1_m: 0.9404
Epoch 93/100
36721/36721 [==============================] - 3s 79us/step - loss: 0.1932 - f1_m: 0.9350 - val_loss: 0.1774 - val_f1_m: 0.9433
Epoch 94/100
36721/36721 [==============================] - 3s 77us/step - loss: 0.1921 - f1_m: 0.9345 - val_loss: 0.1831 - val_f1_m: 0.9424
Epoch 95/100
36721/36721 [==============================] - 3s 81us/step - loss: 0.1916 - f1_m: 0.9351 - val_loss: 0.1746 - val_f1_m: 0.9438
Epoch 96/100
36721/36721 [==============================] - 3s 78us/step - loss: 0.1922 - f1_m: 0.9352 - val_loss: 0.1760 - val_f1_m: 0.9409
Epoch 97/100
36721/36721 [==============================] - 3s 76us/step - loss: 0.1926 - f1_m: 0.9355 - val_loss: 0.1750 - val_f1_m: 0.9435
Epoch 98/100
36721/36721 [==============================] - 3s 83us/step - loss: 0.1920 - f1_m: 0.9352 - val_loss: 0.1835 - val_f1_m: 0.9412
Epoch 99/100
36721/36721 [==============================] - 3s 89us/step - loss: 0.1922 - f1_m: 0.9345 - val_loss: 0.1768 - val_f1_m: 0.9414
Epoch 100/100
36721/36721 [==============================] - 3s 92us/step - loss: 0.1920 - f1_m: 0.9348 - val_loss: 0.1879 - val_f1_m: 0.9403
In [83]:
# Plotting F1 Score and loss curves for train and validation data.
plt.figure(figsize=(10,12))
plt.subplot(221)
plt.title('Loss')
plt.plot(history.history['loss'],label='train')
plt.plot(history.history['val_loss'],label='validation')
plt.legend()

plt.subplot(222)
plt.title('F1 Score')
plt.plot(history.history['f1_m'],label='train')
plt.plot(history.history['val_f1_m'],label='validation')
plt.legend()

plt.show()
In [85]:
# Predicting on train and validation data
y_pred_val_ann=classifier_ANN.predict(X_val)
y_classes_val_ann=y_pred_val_ann.argmax(axis=1)

y_pred_train_ann=classifier_ANN.predict(X_train)
y_classes_train_ann=y_pred_train_ann.argmax(axis=1)

f1_ann_val=f1_score(y_val,y_classes_val_ann)
f1_ann_tr=f1_score(y_train,y_classes_train_ann)
In [86]:
# Printng F1 Scores of train and validation data 
print('Train F1-Score')
print(f1_ann_tr*100)
print('Validation F1-Score')
print(f1_ann_val*100)
Train F1-Score
48.48484848484848
Validation F1-Score
48.86363636363637

Predicting on test set

In [88]:
test_for_prediction=test.drop(columns=['employee_id','age','region'])
In [89]:
test_for_prediction.shape
Out[89]:
(23490, 19)

Decision Tree

In [66]:
prediction_DT=classifier_DT_after.predict(test_for_prediction)
In [67]:
submission=pd.DataFrame({"employee_id":test["employee_id"],
                         "is_promoted":prediction_DT})
submission.to_csv('sub_dt_tuned_trans.csv',index=False)

Random Forest

In [96]:
prediction_RF=classifier_RF.predict(test_for_prediction)
In [97]:
submission=pd.DataFrame({"employee_id":test["employee_id"],
                         "is_promoted":prediction_RF})
submission.to_csv('sub_rf_trans.csv',index=False)

Stacking

In [99]:
prediction_BCS=blended_classifier.predict(test_for_prediction)
In [100]:
submission=pd.DataFrame({"employee_id":test["employee_id"],
                         "is_promoted":prediction_BCS})
submission.to_csv('sub_bcs_trans.csv',index=False)

EVC

In [103]:
prediction_EVC=evc.predict(test_for_prediction)
In [104]:
submission=pd.DataFrame({"employee_id":test["employee_id"],
                         "is_promoted":prediction_EVC})
submission.to_csv('sub_evc_trans.csv',index=False)

Gradient Boosting

In [85]:
prediction_GB=gb.predict(test_for_prediction)
In [86]:
submission=pd.DataFrame({"employee_id":test["employee_id"],
                         "is_promoted":prediction_GB})
submission.to_csv('sub_gb_trans.csv',index=False)

XGBoost

In [95]:
prediction_XGB=xgb.predict(test_for_prediction)
In [96]:
submission=pd.DataFrame({"employee_id":test["employee_id"],
                         "is_promoted":prediction_XGB})
submission.to_csv('sub_xgb_trans.csv',index=False)

Cat Boost

In [90]:
prediction_CB=CB.predict(test_for_prediction)
In [91]:
submission=pd.DataFrame({"employee_id":test["employee_id"],
                         "is_promoted":prediction_CB})
submission.to_csv('sub_cb_trans.csv',index=False)

Results

Best F1-Score on test set achieved on XGBoost classifier which is 51.82.